Workflow from .Midi File (Musescore into SuperCollider)

I would like to generate a .midi file from composed sheet music (e.g. in MuseScore) and then load the .midi file into SuperCollider. I want to be able to do the following with it:

  1. Access different bars/sections of the piece (e.g. only play bars 2-4).
  2. Assign different synths to different parts (e.g. the bass part uses synthA and the melody synthB, or synthA for bars 2-4 and then synthB for bars 5-7).
  3. Ideally, use multiple .midi files at once, so I could have .midi pattern 1 and .midi pattern 2.
  4. Write a trigger to affect different parts, e.g. if a=1 do this, if a=2 do that.

I tried SimpleMIDIFile, but I found it slightly confusing to work with beyond importing the .midi into SuperCollider. I would appreciate any assistance you can offer on these issues.

SimpleMIDIFile does the hard/annoying work of unpacking the bytes in the MIDI file format, but it does almost nothing to dress up the data in a musically useful way. I’m not aware of an existing SC class that parses a MIDI file into time units… so you might have to do it on your own.

This is, at root, a data representation problem. SimpleMIDIFile gives you the deltas (IOIs = inter-onset intervals) between MIDI messages but this doesn’t represent notes, or their metrical positions within bars, in an ideally useful way. After reading, you would have to post-process the SimpleMIDIFile data:

  • Accumulate time: “now” is the sum of all previous IOIs. Store this absolute time with every MIDI event. With this absolute time, you can easily locate the start of bar 2 etc. (or further subdivide the data structure – maybe instead of a flat list of notes with onset times, you might want an array of bars – this is up to you, whatever is convenient for your usage).

  • Note-off messages should search backward for the most recent note-on with the same pitch. The note duration is then “now” minus the note-on’s absolute onset time. It might be good to store the absolute release time as well, to help find notes that are holding over from before.
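
The steps above can be sketched as follows. This assumes a simplified raw event format of [deltaTicks, type, noteNumber] (real SimpleMIDIFile arrays carry more fields, but the logic is the same), and for simplicity it matches a note-off to the earliest pending note-on of that pitch, which only matters if the same pitch overlaps itself:

```supercollider
(
var raw = [
	[0, \noteOn, 60], [384, \noteOff, 60],
	[0, \noteOn, 62], [384, \noteOff, 62]
];
var now = 0, notes = List.new, pending = List.new;
raw.do { |msg|
	var delta = msg[0], type = msg[1], num = msg[2];
	now = now + delta;  // accumulate IOIs into absolute time (in ticks)
	if(type == \noteOn) {
		pending.add((midinote: num, absTime: now));
	} {
		// find a pending note-on with the same pitch
		var i = pending.detectIndex { |ev| ev[\midinote] == num };
		if(i.notNil) {
			var ev = pending.removeAt(i);
			ev[\sustain] = now - ev[\absTime];
			notes.add(ev);
		};
	};
};
notes.postln;
)
```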

Divide up the data based on MIDI channel. I think SimpleMIDIFile has methods for this.

Once you’ve acquired data from multiple MIDI files, you can use separate players to play lines from them simultaneously, or even merge them. After they’re in SC, the data are just data; you can do what you want with them.
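
For example (a sketch, assuming the event arrays already carry \dur values, and that SynthDefs named \synthA and \synthB exist), Ppar plays two such lines together, and Pbindf stamps a different instrument onto each:

```supercollider
(
// two hypothetical note lists, e.g. extracted from two MIDI files
var pattern1 = [(midinote: 60, dur: 1), (midinote: 64, dur: 1)];
var pattern2 = [(midinote: 48, dur: 2)];
Ppar([
	Pbindf(Pseq(pattern1, 1), \instrument, \synthA),
	Pbindf(Pseq(pattern2, 1), \instrument, \synthB)
]).play;
)
```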

Your item 4 is certainly possible in SC, but it isn’t specifically a MIDI question.

hjh

Maybe more basically

I think, after looking into this further, another way to ask this question might be: how do I extract the pitch/rhythm data from a MIDI file into an array?
ex.
import midiFile
Pbind(
    \midinote, Pseq([notesFromMIDIFile]),
    \dur, Pseq([rhythmFromMIDIFile])
)

Do you have any suggestions on doing this?

Programming boils down to three questions:

A. What data do I have?
B. What data do I want?
C. What sequence of operations will transform A into B?

C is, of course, a very complex question :laughing: but I like to state it in this way because it’s usually impossible to start to answer C without knowing A and B (but it’s very tempting to just dive into coding without understanding A and B, which is a great way to waste time).

“How to extract the pitch/rhythm data from a MIDI file into an array”

A. Data we have – first step is to look at the data format in SimpleMIDIFile.

m = SimpleMIDIFile.read("~/share/sc3-plugins/external_libraries/stk/projects/examples/midifiles/tango.mid".standardizePath);

m.tracks;  // 17

m.tempo;  // 127.00025400051
m.division;  // 384 ticks per quarter

m.noteEvents(0, 0) // nothing -- reserved for meta-events
m.noteEvents(0, 1) // a lot...

-> [ [ 1, 1296, noteOn, 0, 50, 39 ], [ 1, 1296, noteOn, 0, 38, 39 ], [ 1, 1460, noteOff, 0, 38, 0 ], [ 1, 1468, noteOff, 0, 50, 0 ], [ 1, 1848, noteOn, 0, 57, 51 ], [ 1, 1936, noteOff, 0, 57, 0 ], ...]

So for notes, we see arrays: [ 1, 1296, noteOn, 0, 50, 39 ] means track number 1, time = 1296 (in clock ticks, 384 ticks = 1 quarter note), event type = note-on, channel number 0, note number 50, velocity 39.
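
The tick arithmetic is straightforward; for instance, assuming 4/4 time:

```supercollider
(
var ticks = 1296, ppq = 384, beatsPerBar = 4;
var beats = ticks / ppq;  // 3.375 beats from the start
var bar = (beats / beatsPerBar).floor;  // 0.0, i.e. this onset falls in the first bar
[beats, bar].postln;
)
```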

Now… how long does this D natural last? Well… that MIDI event doesn’t tell you. The end time of this note is determined by the next noteOff with the same note number.

Oh, reading the help file further, there is a noteSustainEvents method that will include a duration:

m.noteSustainEvents(0, 1);
-> [ [ 1, 1296, noteOn, 0, 50, 39, 172, 0 ], [ 1, 1296, noteOn, 0, 38, 39, 164, 0 ], [ 1, 1848, noteOn, 0, 57, 51, 88, 0 ], [ 1, 1996, noteOn, 0, 69, 58, 52, 0 ], ...]

So that note 50 takes 172 ticks. This method would be more useful.

B. Data we want – This is up to you, depending on your requirements.

I would suggest that separating notes from the rhythm will create complication later. I think it would be better to produce an array of Event objects, which you can then stream out as whole events with Pseq. (It’s a common misconception with patterns that you must have a Pbind to produce events.)
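
A tiny demonstration of that point, using the default synth:

```supercollider
// an array of Events streamed directly by Pseq, no Pbind involved
Pseq([
	(midinote: 60, dur: 0.5),
	(midinote: 64, dur: 0.5),
	(midinote: 67, dur: 1)
], 1).play;
```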

So maybe the output format is like (midinote: ..., velocity: ..., amp: ..., dur: ..., sustain: ..., absTime: ...).

C. How to change the noteSustainEvents into that? Well, the noteSustainEvents (in theory) are already pretty close. Maybe a simple collect would do it…

(
f = { |smf, channel, track|
	var tempo = smf.tempo;  // BPM (unused here, but handy for converting beats to seconds)
	var ppq = smf.division;  // ticks per quarter note
	var lastTime = 0;

	smf.noteSustainEvents(channel, track).collect { |array|
		// array = [track, timeInTicks, type, channel, note, velocity, durationInTicks, releaseVelocity]
		var event = (
			midinote: array[4],
			velocity: array[5],
			amp: array[5] / 127,
			// inter-onset interval from the previous note-on, in beats;
			// note this is the gap *before* this event, whereas \dur in
			// event playback is the wait *after* it, so some massaging
			// may be needed (see below)
			dur: (array[1] - lastTime) / ppq,
			absTime: array[1] / ppq,  // onset in beats from the start
			sustain: array[6] / ppq   // held length in beats
		);
		lastTime = array[1];
		event
	};
};
)

a = f.(m, 0, 1);

// in theory Pseq(a, 1).play should work
// but see below about the time values, which might need some massaging

but this is giving me quite odd time values that don’t seem exactly related to the meter (or I don’t yet understand what the relationship between ticks and metrical note values is – I’m seeing 64th notes returned from this MIDI file; maybe yours would be simpler). I’m basically out of time for now (this is already a long post), so you would have to look into that yourself, or maybe someone else is interested and can help you.

With the absTime in the events, then it would be possible to scan through the array to find specific beats, etc…
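
For instance, assuming 4/4 and the beat-valued absTime produced by f above, bars 2-4 could be pulled out like this (note that the first selected event’s \dur still reflects the gap before the section, so it may need resetting):

```supercollider
(
var beatsPerBar = 4;
var barStart = (2 - 1) * beatsPerBar;  // bar 2 begins at beat 4 (bars counted from 1)
var barEnd = 4 * beatsPerBar;  // end of bar 4
var section = a.select { |ev|
	ev[\absTime] >= barStart and: { ev[\absTime] < barEnd }
};
Pseq(section, 1).play;
)
```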

hjh