Dorico does not indicate ‘double-sharp-down’. I think this is OK, because I think ‘c with double-sharp-down’ should be notated as ‘d with natural-down’. So my new function will do this automatically when entering pitches by MIDI note pitch numbers, as in the example code above. Please let me know if these notations are acceptable. If not, I could change it, or write a subfunction to decide the enharmonic spelling by looking at the next pitch.
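For illustration only, here is a toy Haskell sketch of the kind of default spelling choice described above, ignoring the arrow accidentals; the function name and the sharp-preferring table are my own assumptions, not the actual sclang code:

-- Hypothetical helper: a default (sharp-preferring) spelling for an
-- integer MIDI note number, returned as (step, alter, octave).
defaultSpelling :: Int -> (Char, Int, Int)
defaultSpelling midi = (step, alter, midi `div` 12 - 1)
  where
    (step, alter) =
      [ ('C',0), ('C',1), ('D',0), ('D',1), ('E',0), ('F',0)
      , ('F',1), ('G',0), ('G',1), ('A',0), ('A',1), ('B',0)
      ] !! (midi `mod` 12)

-- defaultSpelling 60 == ('C',0,4); defaultSpelling 61 == ('C',1,4)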
Thanks again! I think writing some classes with methods will not be too difficult once the basic things about pitch notation and rhythmic notation are fixed.
Although I think the rhythm part still needs some work to reach the same basic functionality as the pitch materials. That’s where some tricky things start to appear.
Let’s keep this conversation going here or on GitHub (or Zoom?) with some people who do work along those lines.
Thanks for pointing that out! I will correct my example with a comment and thanks to you!
Glad to hear it.
That is my intention! SC users do not need to know MusicXML. My intention is to let sc-users create their own score using events and arrays according to the notation guidelines. My function (or later my class) will convert it into a musicXML file, and the sc-user can then view the score in their preferred notation software!
What I need to hear is probably the conventions of the notation guidelines from different perspectives.
Yes, the rhythmic part should be reworked.
I intend to implement triple dots. (Single and double dots are currently implemented.)
I think the easiest way is to write the music in ‘senza-misura’ and then let the notation software rebar the music. Otherwise, the user can copy and paste the music into the bars defined by the time signature.
But simply adding triple dots is not enough!
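As a rough illustration of why dots alone cannot cover every duration (a sketch in Haskell with a made-up helper name, not part of the actual implementation): a single note with d dots lasts (2^(d+1) - 1) / 2^k of a whole note for some k, so something like 5/16 of a whole note can never be a single dotted note and always needs a tie (e.g. a quarter tied to a sixteenth).

import Data.Ratio (numerator, denominator)

-- Hypothetical check: can a duration (in whole notes) be written as a
-- single note with at most maxDots dots?
singleNotatable :: Int -> Rational -> Bool
singleNotatable maxDots rq =
  isPow2 (denominator rq)
    && numerator rq `elem` [2 ^ (d + 1) - 1 | d <- [0 .. toInteger maxDots]]
  where
    isPow2 1 = True
    isPow2 n = even n && isPow2 (n `div` 2)

-- singleNotatable 3 (7/16) == True   (a double-dotted quarter)
-- singleNotatable 3 (5/16) == False  (needs a tie)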
Could we discuss this on this forum first?
Zoom may not be as productive, as my English is not that fluent, especially in listening and speaking. GitHub is also a good place to do it, but I think a draft of some classes should already be done for that. I will try to attend the next developer meeting, even if my English is not good enough. Let’s talk about moving to GitHub around the next developer meeting! I think a draft of the classes will be ready in a few weeks.
I’m busy with other things, but I have fun playing around with similar ideas in Haskell. I’ve been experimenting with different approaches to the exact problem you’re working on.
One of the solutions for durations is something like
{-# LANGUAGE GADTs #-}
{-# LANGUAGE TemplateHaskell #-}

import Control.Lens (makeLenses, (^.))
import Data.Ratio ((%))

type Division = Integer
type Dots     = Int
type Rq       = Rational

data Duration where
  Duration :: { _division   :: Division
              , _dots       :: Dots
              , _multiplier :: Rational
              } -> Duration
  deriving (Eq, Show)

makeLenses ''Duration

class HasDuration a where
  _duration :: a -> Duration

instance HasDuration Duration where
  _duration = id

dotMultiplier :: Dots -> Rational
dotMultiplier dotCount = n % d
  where
    n = 2 ^ (dotCount + 1) - 1
    d = 2 ^ dotCount

durationToRq :: Duration -> Rq
durationToRq d = (1 % (d ^. division)) * dotMultiplier (d ^. dots) * (d ^. multiplier)
The division is the note value. The number of dots alters the duration according to the dotMultiplier function. _multiplier is there to be used when the duration is contained inside a tuplet.
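For example, a quick usage check of my own (assuming the definitions above, with Control.Lens and Data.Ratio in scope): a dotted eighth whose multiplier is 2/3, i.e. an eighth-note-triplet member, comes out to (1/8) * (3/2) * (2/3) = 1/8 of a whole note.

exampleRq :: Rq
exampleRq = durationToRq (Duration { _division = 8, _dots = 1, _multiplier = 2 % 3 })
-- exampleRq == 1 % 8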
Logical ties, used for durations such as a quarter note tied to a 32nd note, are another case we need to take care of.
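One possible way to handle that (just a sketch with a made-up name, assuming durations are rationals of a whole note with power-of-two denominators no finer than 1/1024): greedily peel off the largest single-note value, allowing up to two dots, and tie the pieces together.

import Data.Ratio ((%))

-- Split a duration into notatable values (plain, dotted or double-dotted),
-- to be joined by ties.  E.g. 9/32 -> [1/4, 1/32], a quarter tied to a 32nd.
splitForTies :: Rational -> [Rational]
splitForTies 0  = []
splitForTies rq = d : splitForTies (rq - d)
  where
    d = head [ v | v <- candidates, v <= rq ]
    -- a descending list: double-dotted, dotted and plain versions of each value
    candidates =
      [ m * (1 % (2 ^ k)) | k <- [0 .. 10 :: Integer], m <- [7 % 4, 3 % 2, 1] ]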
I’m not 100% sure how tuplets work in MusicXML, but I believe there should be a tag, and it would work just like a span, such as a slur, right? The difference is that it would be a sort of container that alters the _multiplier of each duration it contains. That’s one of the solutions.
Some fun edge cases are nested tuplets and metric modulations… (Lilypond supports the former nicely, but I had to write some functions to display and perform the metric modulations!)
IIRC Lilypond will not automatically break up notes to flow across barlines - if you type a4 a4 a4 a2 for example it won’t split the final half note into two tied quarters.
If you are curious to play around there is an online editor for Lilypond here: https://www.hacklily.org/
Yes, I know that. But it would be nice to come up with a model that works well with both LilyPond and MusicXML. Maybe Abjad made a mistake in mirroring LilyPond’s data types. As a matter of fact, MusicXML can’t be forgotten.
Right now I’m trying to figure out whether MusicXML is capable of doing all those things. It’s a bit arcane and weird, and very messy. But it’s probably not impossible.
Some of my experiments (just the first sketches) are already doing the basic job, but there are always “edge cases” when we mix logical ties and tuplets.
Thanks for all the information.
I am ashamed that I do not understand Haskell, but I will try to learn what I can from it.
I find it somewhat strange that sc-users do not write libraries to produce scores for sclang, but rather for other languages…
Anyway…
Yes, it is!
I have implemented simple tuplets and nested tuplets. In Dorico and Finale, both are decoded and appear as expected. In MuseScore, nested tuplets are not decoded and displayed correctly.
I think the basic features are done, but I am still not sure if the way of notating the score is good enough. I have other simpler ideas to notate them… but still not sure…
I started using sclang and LilyPond 10 years ago, but never felt the need to expand that into a complete system like music21 and other huge projects. I’ve also been using a pipe between sclang and OpenMusic, and from there I export music segments and materials into Finale. I’ve composed several works this way as well.
My experiments with Haskell are fairly recent; I felt that it could offer something different in how to represent the material, how to manipulate it, and so on.
The most interesting part, as you surely know, comes precisely after that: how to manipulate these materials. I’m always thinking of new ways; it’s nothing against the sclang language. )))
To be honest, I have not seriously used computer-aided algorithmic composition in my work, although many parts of my work tend to have an algorithmic character.
I did try something in OpenMusic, but that was just an experiment.
I did it several times for research, and the following is one of them: hommage à l'impressionnisme
In these experiments, I thought it would be nice if I could get a score from these codes instead of a spectrum in an audio file editor.
The reason I want to implement this in sclang at the moment is firstly for my students who are not musicians.
When I finish implementing the musicXML file creator for sclang, I will be able to use powerful algorithmic composition for my own work!
I have used MIDI to transfer musical information, but I think musicXML is better.
If you think the abbreviation is not good, I will change the rhythmic part to be the same as in lilypond.
I find it confusing that the accidental notation follows lilypond, but the octave notation follows scientific pitch notation.
I have also looked at lilypond notation, guido notation and abc notation. None of them use scientific pitch notation octave numbers. I have no problem using Helmholtz pitch notation, but feel a bit strange using it with an English based programming language like sclang.
I am not sure whether using = to attach durations, articulations, etc. to the pitches is a good idea…
Moreover, q, e, x, t and i for rhythmic values do not seem to be easily understandable as 4, 8, 16, 32 and 64…
I do find the = hard to read - I think it was easier to read with spaces. _ would be ok but a little annoying to type.
Some thoughts about tuplets:
If I understand the current syntax, /e/3/2 is used to indicate an eighth-note triplet - so /e/3/2/e both begins the triplet and indicates the first eighth note, and later // is attached to the final note of the triplet.
It shouldn’t be necessary to indicate the duration of a tuplet if you are also indicating the end and the ratio. So /3/2 … // would be sufficient.
But then why not go further and assume 3:2 if only 3 is written (as is done in sheet music)? So 3 would imply 3:2, 5 would imply 5:4, and 6 would imply 6:4, so you could write
/3 ... //
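Just to spell out that default-ratio rule (a sketch, nothing implemented; the name defaultNormal is made up): the implied normal count would be the largest power of two below the actual count, so 3 → 3:2, 5 → 5:4, 6 → 6:4, 7 → 7:4.

-- Hypothetical rule for the implied tuplet ratio (actual must be >= 2).
defaultNormal :: Integer -> Integer
defaultNormal actual = last (takeWhile (< actual) (iterate (* 2) 1))
-- defaultNormal 3 == 2, defaultNormal 5 == 4, defaultNormal 6 == 4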
I also like the idea of expressing the ratio with a : since there are so many / everywhere!
/5:7 ... //
But it would be even more readable to use square brackets for tuplets, something like a5/e[3 /e /e ] or a5/e[5:7 /e /e /e /e/f]
<a4=e5=a5>=/e/3/2/e=p=(
<a4`e5`a5>`/e/3/2/e`p`(
<a4,e5,a5>,/e/3/2/e,p,(
<a4-e5-a5>-/e/3/2/e-p-( // - is used for articulation (tenuto).
<a4;e5;a5>;/e/3/2/e;p;(
<a4\e5\a5>\/e/3/2/e\p\(
<a4\e5\a5>_/e/3/2/e_p_( // _ is used for articulation (detached-legato).
_ is good for readability, but it requires the shift key, so I think it is a bit impractical. Could you choose a character other than _? I am open to your opinion; if you think _ is the best one, then I will follow your suggestion.
About Tuplets
The following are the meanings of / and //:
/ appended to the leftmost side of information:
An indicator of tuplet notation.
/ attached to the rightmost side of information:
The note is the last note of a tuplet (a tuplet can be a nested tuplet).
// attached to the leftmost side of information:
An indicator of nested tuplets.
// attached to the rightmost side of information:
The note is the last note of a tuplet and also the last note of a nested tuplet.
For example:
/e/3/2/e:
the eighth note which is part of the 3:2 ratio of the eighth.
/e/
the eighth note which is the last note of a tuplet.
/e//
the eighth note which is the last note of a tuplet and a nested tuplet.
The following example may illustrate why the note value is needed at the end of tuplet notation:
/q/3/2/e:
the quarter note, which is part of the 3:2 ratio of the eighth.
However, /e/3/2/e can also be written as /e/3/2. I will add it! Thanks!
I thought so, too. /e/3:2/e is better than /e/3/2/e.
OK! I can change it!
I used [] pairs for the array notation of musical entries, as in example 1 in the following post:
In the string (or symbol) notation for musical entries, I do not use parentheses and braces to bind information.
I attach the structure of the musicXML part for the tuplet and then explain my notation details.
Below is the musicXML code for a three-note chord:
<note>
  <pitch>
    <step>A</step>
    <alter>0.0</alter>
    <octave>4</octave>
  </pitch>
  <duration>84000</duration>          <!-- related to non-tuplet and tuplet -->
  <voice>1</voice>
  <type>eighth</type>                 <!-- related to non-tuplet and tuplet -->
  <accidental>natural</accidental>
  <time-modification>                 <!-- related to tuplet -->
    <actual-notes>3</actual-notes>
    <normal-notes>2</normal-notes>
    <normal-type>eighth</normal-type>
  </time-modification>
  <staff>1</staff>
  <notations>
    <tuplet type="start" bracket="yes" number="1">   <!-- related to tuplet -->
      <tuplet-actual>
        <tuplet-number>3</tuplet-number>
        <tuplet-type>eighth</tuplet-type>
      </tuplet-actual>
      <tuplet-normal>
        <tuplet-number>2</tuplet-number>
        <tuplet-type>eighth</tuplet-type>
      </tuplet-normal>
    </tuplet>
  </notations>
  <notations>
    <slur number="1" type="start"/>
  </notations>
</note>
<note>
  <chord/>
  <pitch>
    <step>E</step>
    <alter>0.0</alter>
    <octave>5</octave>
  </pitch>
  <duration>84000</duration>
  <voice>1</voice>
  <type>eighth</type>
  <accidental>natural</accidental>
  <staff>1</staff>
</note>
<note>
  <chord/>
  <pitch>
    <step>A</step>
    <alter>0.0</alter>
    <octave>5</octave>
  </pitch>
  <duration>84000</duration>
  <voice>1</voice>
  <type>eighth</type>
  <staff>1</staff>
</note>
There are two (or three) components that define rhythm:
duration
note type
dot if present
There are many components to define a tuplet, but the core components are the following three in time modification:
actual-notes
normal-notes
normal-type
The other components can be constructed from these three.
Nested tuplets are computed from these components and the tuplet number.
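To make that concrete, here is a small sketch of my own (not the actual implementation) of how the written type, the time-modification ratio and any nesting combine into a rendered length: the written value is scaled by normal-notes / actual-notes once per tuplet level. For the chord above, an eighth under 3:2 gives (1/8) * (2/3) = 1/12 of a whole note, which matches <duration>84000</duration> if the file uses 252000 divisions per quarter note.

import Data.Ratio ((%))

-- writtenType is the reciprocal note value (8 = an eighth); each tuplet level
-- contributes an (actual, normal) pair; the order of levels does not matter,
-- since the factors simply multiply.
tupletRq :: Integer -> [(Integer, Integer)] -> Rational
tupletRq writtenType levels =
  (1 % writtenType) * product [ normal % actual | (actual, normal) <- levels ]

-- tupletRq 8 [(3, 2)]         == 1 % 12
-- tupletRq 8 [(3, 2), (5, 4)] == 1 % 15   (an eighth inside 5:4 inside 3:2)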
In the string (or symbol) notation of musical entries, / is the tuplet indicator, marking both the start and the end of the tuplet information.
Thus, the first note of a tuplet requires the following four things, none of which can be currently omitted in either notation:
← In the case of 3 and 4, the last note value can be omitted. I will add it!
The last note of a tuplet should indicate that it is the last note of the tuplet. This is indicated by \ for the array notation of musical entries and / for the string (or symbol) notation of musical entries. They come at the end of the tuplet information, as follows:
/e/
[\e, ]
The notes between the first tuplet note and the last tuplet note need no tuplet information; they only need their rhythmic values (e.g. simply e for an eighth note).
I can’t speak for the OP, but I would hope that this project would be a Quark (of course, almost everything will be a Quark if the grand refactoring comes to be).
@nathan @semiquaver
I think this functionality should be added to the core class library.
Currently, musical notation for sclang is available via LilyPond through the following Quarks:
fosc (← works on my end),
Superfomus,
LilyCollider.
SuperCollider’s speciality is algorithmic composition, but it does not provide musical notation or pitch-naming conventions like scientific pitch notation. This is strange to me.
I have almost achieved this functionality with
exporting a musicXML file and opening it in a music notation program,
creating an SCD file with all the abbreviated notation written in standard notation, and automatically detecting and parsing the tied notes for playback with Synth, and
automatically playing back the SCD file and opening it in the SC IDE.
I hope to upload the package as soon as possible, and will ask the development group to include the package in the core class library.
The problem is that the class I am writing is long, and I am not sure that the structure of the code is optimal in terms of memory and CPU usage. Someone should review it.