A class or method to convert integers and floats to scientific pitch notation?

I started using sclang and lilypond 10 years ago, but never felt the need to expand that into a complete system like music21 and other huge projects. I have also been using a pipe between sclang and OpenMusic, and from there I export music segments and materials into Finale. I have composed several works this way as well.

A piece to illustrate that (I used this process to compose the score; the live electronics are also SuperCollider): Bernardo Barros - Attrito [2018] | International Contemporary Ensemble feat. David Fulmer on Vimeo

My experiments with Haskell are fairly recent; I felt it could offer something different in how to represent the material, how to manipulate it, and so on.

The most interesting part, as you know, comes precisely after that: how to manipulate these materials. I'm always thinking of new ways; it's nothing against the sclang language. )))

2 Likes

Thanks for the great piece!

To be honest, I have not seriously used computer-aided algorithmic composition in my work, although many parts of my work tend to have an algorithmic character.
I did try something in OpenMusic, but that was just an experiment.

I have done this several times for research; the following is one example: hommage à l'impressionnisme
In these experiments, I thought it would be nice if I could get a score from this code instead of a spectrum in an audio file editor.

The reason I want to implement this in sclang at the moment is, first of all, for my students who are not musicians.

When I finish implementing the musicXML file creator for sclang, I will be able to use powerful algorithmic composition for my own work!

I have used MIDI to transfer musical information, but I think musicXML is better.

1 Like

Thank you for listening!

)))

Now the following notation is accepted. This new notation is a bit similar to Panola by @shiihs.

(
~score9 = [
	(   title: 'untitled', composer: 'me', rights: '©'),
	(
		bar: 1,
		p1: (
			lbl: \,
			atr: (
				key: [1, \major],
				time: [4, 4],
				staves: 1,
				clef: [[\g, 2]]
			),
			v1: [
				"t 120",
				"a4 /e/3/2/e p (", // /: tuplet; (: slur start
				"| //x/3/2/x f",   // |: the same pitch(es) as before //: nested tuplet
				"| //e/ f",        // //e/: nested tuplet end
				"| /5/ j",         // /5/: tuplet end
				"| /e../6/4/x",    // e.. double-dotted eighth
				"| /t sf",
				"| /x s2",
				"| /4/ sf j",      // j: tie start
				"| /e./5/4/x J",   // J: tie end
				"aqs4 /e/ sf j",
				"| /e./3/2/e J",
				"| /t/ ff j"
			]
		)
	),
	(
		bar: 2,
		p1: (
			v1: [
				\s1,
				'aqs4 /e/3/2/e J',
				'| /e',
				'| /5/',
				'a4 /x/5/4/x',
				'| /x',
				'\\', // the repetition of the same entry
				'| //x/3/2/x',
				'| //x',
				'| //x//',
				'| x.',
				'| t',
				'| x..',
				'| i',
				'| e...',
				'| i )' // slur end
			]
		)
	)
];

~exportXML.(~score9, "~/Downloads/nested tuplets string or symbol.musicxml".standardizePath, 'MuseScore 4')
)

If you think the abbreviation is not good, I will change the rhythmic part to be the same as in lilypond.
I find it confusing that the accidental notation follows lilypond, but the octave notation follows scientific pitch notation.
I have also looked at lilypond notation, GUIDO notation and ABC notation. None of them uses scientific pitch notation octave numbers. I have no problem using Helmholtz pitch notation, but it feels a bit strange to use it with an English-based programming language like sclang.
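Coming back to the question in the thread title, a minimal sketch (my own; ~midiToSPN is a hypothetical name, not part of any quark, and it only spells accidentals as sharps) of converting a MIDI note number, integer or float, to a scientific-pitch-notation string could look like this:

(
// map a MIDI note number to scientific pitch notation;
// the fractional part of a float is reported as a cent deviation
~midiToSPN = { |midinote|
	var names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"];
	var rounded = midinote.round.asInteger;
	var cents = ((midinote - rounded) * 100).round.asInteger;
	var octave = (rounded div: 12) - 1; // MIDI 60 -> C4
	var name = names[rounded % 12] ++ octave;
	if(cents == 0) { name } { name ++ (if(cents > 0) { "+" } { "" }) ++ cents ++ "c" }
};

~midiToSPN.(69);    // -> "A4"
~midiToSPN.(60.25); // -> "C4+25c"
)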

Now the following notation is accepted:

(
~score10 = [
	(   title: 'untitled', composer: 'me', rights: '©'),
	(
		bar: 1,
		p1: (
			lbl: \,
			atr: (
				key: [1, \major],
				time: [4, 4],
				staves: 1,
				clef: [[\g, 2]]
			),
			v1: "
t=120
<a4=e5=a5>=/e/3/2/e=p=(
|=//x/3/2/x=f
|=//e/=f
|=/5/=j
|=/e../6/4/x
|=/t=sf
|=/x=s2
|=/4/=sf=j
|=/e./5/4/x=J
aqs4=/e/=sf=j
|=/e./3/2/e=J
|=/t/=ff=j
"
		)
	),
	(
		bar: 2,
		p1: (
			v1: "
s1
aqs4=/e/3/2/e=J |=/e |=/5/ 
a4=/x/5/4/x |=/x \\ |=//x/3/2/x |=//x |=//x// 
|=x. |=t |=x.. 
|=i |=e... |=i=)
"
		)
	)
];

~exportXML.(~score10, "~/Downloads/nested tuplets string.musicxml".standardizePath, 'MuseScore 4')
)

I am not sure whether using = to attach duration, articulation, etc. to the pitches is a good idea…
Moreover, q, e, x, t and i for rhythmic values do not seem to be as understandable as 4, 8, 16, 32 and 64…

Any opinions?

1 Like

I do find the = hard to read; I think it was easier to read with spaces. _ would be OK, but a little annoying to type.

Some thoughts about tuplets:

If I understand correctly, /e/3/2 is currently used to indicate an eighth-note triplet, so /e/3/2/e both begins the triplet and indicates the first eighth note; later, // is attached to the final note of the triplet.

It shouldn't be necessary to indicate the duration of a tuplet if you are also indicating the end and the ratio, so /3/2 … // would be sufficient.

But then why not go further and assume 3:2 if only 3 is written (as is done in sheet music)? So 3 would imply 3:2, 5 would imply 5:4, and 6 would imply 6:4, so you could write

/3 ... //
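A default like that is simple to compute; here is a tiny sketch (the helper name ~defaultNormal is mine, purely for illustration) that picks the largest power of two below the actual count:

(
~defaultNormal = { |actual| (2 ** actual.log2.floor).asInteger };
[3, 5, 6, 7, 9].collect { |n| n -> ~defaultNormal.(n) };
// -> [ (3 -> 2), (5 -> 4), (6 -> 4), (7 -> 4), (9 -> 8) ]
)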

I also like the idea of expressing the ratio with a : since there are so many / everywhere!

/5:7 ... //

But it would be even more readable to use square brackets for tuplets, something like a5/e[3 /e /e ] or a5/e[5:7 /e /e /e /e/f]

1 Like

@semiquaver
Thank you for your feedback!

About Separator

The following are alternatives:

<a4=e5=a5>=/e/3/2/e=p=(
<a4`e5`a5>`/e/3/2/e`p`(
<a4,e5,a5>,/e/3/2/e,p,(
<a4-e5-a5>-/e/3/2/e-p-(  // - is used for articulation (tenuto).
<a4;e5;a5>;/e/3/2/e;p;(
<a4\e5\a5>\/e/3/2/e\p\(
<a4\e5\a5>_/e/3/2/e_p_( // _ is used for articulation (detached-legato).

_ is good for readability, but it requires pressing the shift key, so I think it is a bit impractical. Could you choose a character other than _? I am open to your opinion; if you think _ is the best one, then I will follow your suggestion.

About Tuplets

The following are the meanings of / and //:

  • / prefixed to the information:
    an indicator of tuplet notation.
  • / suffixed to the information:
    the note is the last note of a tuplet (the tuplet can be a nested tuplet).
  • // prefixed to the information:
    an indicator of nested tuplets.
  • // suffixed to the information:
    the note is the last note of a tuplet and also the last note of a nested tuplet.

For example:

  • /e/3/2/e:
    the eighth note which is part of the 3:2 ratio of the eighth.
  • /e/:
    the eighth note which is the last note of a tuplet.
  • /e//:
    the eighth note which is the last note of a tuplet and of a nested tuplet.

The following example may illustrate why the note value is needed at the end of tuplet notation:

  • /q/3/2/e:
    the quarter note, which is part of the 3:2 ratio of the eighth.

However, /e/3/2/e can also be written as /e/3/2, since the note value and the tuplet normal-type are the same. I will add it! Thanks!

I thought so, too: /e/3:2/e is better than /e/3/2/e. OK, I can change it!
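For the record, a rough sketch (the helper name ~parseTuplet is my own, not the quark's actual parser) of how such a first-note token could be split into its components:

(
// split a tuplet token such as "/e/3:2/e" or "/e/3:2" into
// note type, actual/normal counts, and normal type
~parseTuplet = { |token|
	var parts = token.split($/).reject { |s| s.isEmpty }; // e.g. [ "e", "3:2", "e" ]
	var ratio = parts[1].split($:).collect { |s| s.asInteger };
	var normalType = if(parts.size > 2) { parts[2] } { parts[0] }; // omitted -> same as note type
	(noteType: parts[0], actual: ratio[0], normal: ratio[1], normalType: normalType)
};

~parseTuplet.("/e/3:2");   // -> ( noteType: "e", actual: 3, normal: 2, normalType: "e" )
~parseTuplet.("/q/3:2/e"); // -> ( noteType: "q", actual: 3, normal: 2, normalType: "e" )
)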

I used [] pairs for the array notation of musical entries, as in example 1 in the following post:

v1: [
                [[\a7, \a6], [\e, 3, 2, \e]],    // [\a7, \a6]: a two-note chord; [\e, 3, 2, \e]: tuplet start. \e: eighth.
                [$|, [\e]],                      // $|: the same pitches as before; [\e]: tuplet continues.
                [$|, [5, \]],                    // [5, \]: tuplet ends. \: end sign, 5 = eighth.
                [[\af7, \as6], [\e, 3, 2, \e]],  // [\e, 3, 2, \e]: tuplet start.
                [$|, [[\x, 3, 2, \x]]],          // [[\x, 3, 2, \x]]: nested tuplet start; x: sixteenth.
                [$|, [[\x]]],                    // [[\x]]: nested tuplet continues.
                [$|, [[\x, \]]],                 // [[\x, \]]: nested tuplet end.
                [$|, [\e, \]],                   // [\e, \]: tuplet end.
                [$|, [\e, 5, 4, \e]],
                [$|, [[\e, 3, 2, \e]]],
                [$|, [[\q, \]]],
                [$|, [\q, \]]
            ]

In the string (or symbol) notation for musical entries, I do not use parentheses and braces to bind information.

I attach the structure of the musicXML part for the tuplet and then explain the details of my notation.

Below is the musicXML code for a three-note chord:

      <note>
        <pitch>
          <step>A</step>
          <alter>0.0</alter>
          <octave>4</octave>
        </pitch>
        <duration>84000</duration>              <!-- related to non-tuplet and tuplet -->
        <voice>1</voice>
        <type>eighth</type>                     <!-- related to non-tuplet and tuplet -->
        <accidental>natural</accidental>
        <time-modification>                     <!-- related to tuplet -->
          <actual-notes>3</actual-notes>       
          <normal-notes>2</normal-notes>
          <normal-type>eighth</normal-type>
        </time-modification>
        <staff>1</staff>
        <notations>
          <tuplet type="start" bracket="yes" number="1"> <!-- related to tuplet -->
            <tuplet-actual>
              <tuplet-number>3</tuplet-number>
              <tuplet-type>eighth</tuplet-type>
            </tuplet-actual>
            <tuplet-normal>
              <tuplet-number>2</tuplet-number>
              <tuplet-type>eighth</tuplet-type>
            </tuplet-normal>
          </tuplet>
        </notations>
        <notations>
          <slur number="1" type="start"/>
        </notations>
      </note>
      <note>
        <chord/>
        <pitch>
          <step>E</step>
          <alter>0.0</alter>
          <octave>5</octave>
        </pitch>
        <duration>84000</duration>
        <voice>1</voice>
        <type>eighth</type>
        <accidental>natural</accidental>
        <staff>1</staff>
      </note>
      <note>
        <chord/>
        <pitch>
          <step>A</step>
          <alter>0.0</alter>
          <octave>5</octave>
        </pitch>
        <duration>84000</duration>
        <voice>1</voice>
        <type>eighth</type>
        <staff>1</staff>
      </note>

There are two (or three) components that define rhythm:

  • duration
  • note type
  • dot if present

There are many components to define a tuplet, but the core components are the following three in time modification:

  • actual-notes
  • normal-notes
  • normal-type

The other components can be constructed from these three.

Nested tuplets are computed from these components and the tuplet number.
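To make the arithmetic concrete, here is a rough sketch of how a duration value could be derived from these pieces. The divisions-per-quarter value of 252000 is only my assumption, chosen so that an eighth in a 3:2 tuplet yields 84000 as in the example above; the quark's real code may differ.

(
~divisions = 252000; // assumed <divisions> per quarter note
~typeFactor = (q: 1, e: 0.5, x: 0.25, t: 0.125, i: 0.0625); // quarter, 8th, 16th, 32nd, 64th

~durationOf = { |type, dots = 0, actual = 1, normal = 1|
	var base = ~divisions * ~typeFactor[type];
	var dotted = base * (2 - (0.5 ** dots)); // one dot -> 1.5x, two dots -> 1.75x, ...
	(dotted * normal / actual).asInteger
};

~durationOf.(\e);          // plain eighth           -> 126000
~durationOf.(\e, 0, 3, 2); // eighth in a 3:2 tuplet -> 84000
~durationOf.(\e, 2);       // double-dotted eighth   -> 220500
)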

In the string (or symbol) notation of musical entries, / is the tuplet indicator: it marks both the start and the end of the tuplet information.

Thus, the first note of a tuplet requires four pieces of information, none of which can currently be omitted in either notation:

  1. /q/3/2/e: note-type / tuplet actual-note / tuplet normal-note / tuplet normal-type
  2. [\q, 3, 2, \e]
  3. /e/3/2/e: note-type / tuplet actual-note / tuplet normal-note / tuplet normal-type
  4. [\e, 3, 2, \e]

(In the cases of 3 and 4, the last note value can be omitted, since it is the same as the note type. I will add this!)

The last note of a tuplet must indicate that it is the last note of the tuplet. This is indicated by \ in the array notation of musical entries and by a trailing / in the string (or symbol) notation. It comes at the end of the tuplet information, as follows:

  • /e/
  • [\e, \]

The notes between the first and the last note of a tuplet need no tuplet information, only their rhythmic values:

  • /e
  • [\e]

Now the tuplet notation is simplified!
Thanks, @semiquaver!
Now the following notation is accepted (` is a provisional separator):

(
~score10 = [
	(   title: 'untitled', composer: 'me', rights: '©'),
	(
		bar: 1,
		p1: (
			lbl: \,
			atr: (
				key: [1, \major],
				time: [4, 4],
				staves: 1,
				clef: [[\g, 2]]
			),
			v1: "
t`120
<a4`c5`a5>`/e/3:2`p`>`(
|`//x/3:2`f`.
|`//e/`f`!
|`/5/`j
|`/e../6:4/x
|`/t`sf`_
|`/x`s2`-
|`/4/`sf`^
|`/e./5:4/x
aqs4`/e/`sf`j
|`/e./3:2/e`J
|`/t/`ff`j
"
		)
	),
	(
		bar: 2,
		p1: (
			v1: "
s1
aqs4`/e/3:2/e`J |`/e |`/5/ a4`/x/5:4/x |`/x \\ |`//x/3:2/x |`//x |`//x// |`x. |`t |`x.. |`i |`e... |`i`)
"
		)
	)
];

~exportXML.(~score10, "~/Downloads/nested tuplets string.musicxml".standardizePath, 'Dorico 4')
)

I want to understand something here. Is the intention to add this functionality to the core class library, or to develop a quark?

1 Like

I can't speak for OP, but I would hope that this project would be a Quark (of course, almost everything will be a Quark if the grand refactoring comes to be :wink:).

OP's work has spurred discussion of a narrower idea, implementing some kind of PitchClass as part of the core library; see the discussion here: Note Names / Pitch Classes, and here: Class Library Developer Group - #38 by mike

1 Like

@nathan @semiquaver
I think this functionality should be added to the core class library.

Currently, musical notation in sclang is available through the following Quarks, which build on lilypond:

  • fosc (this one works on my end),
  • Superfomus,
  • LilyCollider.

SuperCollider's speciality is algorithmic composition, but it provides neither music notation nor pitch-naming conventions such as scientific pitch notation. This is strange to me.

I have almost achieved this functionality with

  • exporting a musicXML file and opening it in a music notation program,
  • creating an SCD file with all the abbreviated notation written in standard notation, and automatically detecting and parsing the tied notes for playback with Synth, and
  • automatic playback of the SCD file and opening it in SC-IDE.

I hope to upload the package as soon as possible, and will ask the development group to include the package in the core class library.

The problem is that the class I am writing is long, and I am not sure that the structure of the code is optimal in terms of memory and CPU usage. Someone would need to review it.

Not being an SC dev anymore, I hope that providing armchair opinions in the absence of other contributions doesn’t come off as rude, but I’m strongly opposed to this going into core. A problem that I’ve mentioned concerning SC’s longevity is that the core library has far too many components it doesn’t need. Putting anything in core is a big responsibility; as long as backward compatibility is a concern, it has to be maintained forever, and the development cost has to be justified. Frankly I don’t think it is in this case; this is just way too specific.

2 Likes

You’ve prototyped a DSL for music notation, which is invariably fun but also very specific.

Either way, for your code to integrate into the SC ecosystem, you must develop an OOP design for the elements (pitches, accidentals, durations, tuplets, ties, dynamics, etc.). Perhaps the most logical step would be to think of a quark that exactly mirrors the elements of musicXML. No need to build a complex system at this point. Keep it simple there.

Then, in a separate quark, you implement your DSL, which will combine all these objects.

You have already made a proof of concept, and it works. Well done!

2 Likes

Hi @rdd

I found your work on rhythms quite exciting. It seems that the relationship between duration and traditional (tree) notation is a very slippery arithmetic. Not every rational number can be expressed as a duration; at the same time, of the rational numbers this arithmetic does accept, some correspond to only one duration, while others can have 2, 3, or 4 different durations. Verifying this with QuickCheck tests (durationToRq . rqToDuration :: Duration → [Duration]) was curious:

https://gist.github.com/smoge/a0b9bfd19778fccbdb88090bb9a438c9

Strangely, some notation systems (such as GUIDO) accept only the Rq values, and it does not seem right to assume that the inference can always be generalized, so these systems will fail in the less simple cases. (Correct me if I'm wrong.) As far as I understand, the two representations are complementary.

I want to delve deeper if you have more references on this. I also see you have a lot of exciting stuff there, from Barlow to Larry Polansky and many others. I haven't had time to check it out yet.

1 Like

My programming skills are not professional, and I cannot always take responsibility for maintenance. The first step in publishing should be a Quark, unless some people in the development group check and repackage my code.

While developing my idea, I have also been thinking about transposition. For example,

\a4 + \p5 // midi pitch number + perfect fifth

returns:

e5

With an OOP design, it would be as follows:

A4.() + P5.() // midi pitch number + perfect fifth

returns:

A5

Yes, I think so too! This is why I proposed the following classes (I changed the name to Notator):

However, there are some problems here:

  1. Rest is already a subclass of Operand, so I think the Note class and its subclasses should exist as subclasses of Operand.
  2. Then, should SPN and PitchClass by @josh also be under Operand?
  3. Notating such musical information with an OOP design seems to lose readability… the code seems to become very complicated.

The code snippet you've shared doesn't seem to follow a typical object-oriented design as in SC. For the kind of operation you posted, you'd have separate Pitch and Interval classes with operations defined between them. If you define classes for pitches and intervals, their interactions can be treated as vector-space arithmetic or, more precisely, as an "affine space": you can subtract one pitch from another to derive an interval, or add an interval to a pitch to obtain a new pitch (as demonstrated in your example).

https://en.wikipedia.org/wiki/Affine_space

I imagine A4.() + P5.() are just pitch literals operating with interval literals, but even then, it's not a very convenient literal syntax.

However you implement this, it can be kept separate from the code that creates musicXML. This way, it will be more modular and more helpful to others. For example, I don't see the need for an interval class in the musicXML quark.
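For illustration only, here is a minimal sketch of such a Pitch/Interval pair; the class names, the MIDI-number representation and the methods are all placeholders, not an existing API. Class definitions like these have to live in a .sc file in the class library, not in an interpreter window:

// Pitch + Interval -> Pitch, Pitch - Pitch -> Interval (the affine idea above)
Pitch {
	var <midinote;
	*new { |midinote| ^super.newCopyArgs(midinote) }
	+ { |anInterval| ^Pitch(midinote + anInterval.semitones) }
	- { |aPitch| ^Interval(midinote - aPitch.midinote) }
	printOn { |stream| stream << "Pitch(" << midinote << ")" }
}

Interval {
	var <semitones;
	*new { |semitones| ^super.newCopyArgs(semitones) }
	printOn { |stream| stream << "Interval(" << semitones << ")" }
}

// usage, once the classes are compiled:
// Pitch(69) + Interval(7)  // -> Pitch(76), i.e. A4 up a perfect fifth
// Pitch(76) - Pitch(69)    // -> Interval(7)

A fuller design would store spelled note names (step, alteration, octave) rather than MIDI numbers, so that A4 + P5 comes out as E5 rather than an enharmonic equivalent, but the affine structure stays the same.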

2 Likes

Yes, it's not a linear relation, cf. rq_to_duration at:

https://gitlab.com/rd--/hmt/-/blob/master/Music/Theory/Duration/Rq.hs#L65

There are lots of nice notations for duration; the "Bel" notations, for instance, are rather nice:

https://gitlab.com/rd--/hmt/-/blob/master/Music/Theory/Time/Bel1990/R.hs#L78

1 Like

I made two classes, SPN and PitchClassSet, with help documents that can be used with the SC-IDE help document browser.

https://scsynth.org/t/note-names-pitch-classes/8061/3?u=prko

My PitchClassSet class is independent of @josh’s PitchClass class.

Any comments are welcome, but we should discuss this subject in the thread that @semiquaver made. That thread does not include musicXML-related perspectives.

Thank you for all of your feedback.

@rdd

Thank you for your opinion. I am also hesitant to apply this notation (A4.(\eighth), etc.); it is as complicated as I expected.

I will need some time to reflect on the concepts in the articles you mentioned (Affine Space, Representing Music with Prefix Trees, and the code by @rdd). By August I should have finished what I am doing over the summer. My work process is not that fast due to many gaps in my programming skills.

I have made a Quark library related to this thread. It seems to work with the examples given, but it needs further work. SPN and PitchClassSet are included in this Quark library.

Detailed help documents are provided for each class, as well as a Notator score guide!

1 Like