A while back, I ran into difficulty: MIDIOut synchronizes messages with server synths by making “latency” a property of the MIDIOut port. VSTPluginMIDIProxy imitates MIDIOut’s interface, but doesn’t implement latency. This complicates the building of higher-level abstractions, requiring two versions (one for MIDIOut, the other for VSTPlugin).
I took this as an opportunity to consider alternate designs. I’d love to hear comments on this approach.
The design I’m considering now (see this gist) looks like this:
- `MIDISender` (or `VSTPluginMIDISender`) is the transport mechanism.
  - `MIDISender` would follow `MIDIOut` conventions for selecting and connecting to ports.
  - `VSTPluginMIDISender` talks to a `VSTPluginController` instead.
- `MIDI...Message` objects encapsulate the message types.
```supercollider
m = MIDISender(0).connect(1);  // or VSTPluginMIDISender(controller)
n = MIDINoteMessage(device: m);

// note: 2 is a dur parameter -- automatic note-off!
// (nil duration suppresses auto-release)
n.play(60, 64, 2);
```
An earlier version of this approach required you to put the MIDI parameters into the object first, and then `play(midisender, latency)`, but I felt pretty quickly that this was a bit clumsy to use. So I tried inverting it: the midisender and latency are properties of the message object, and `play` accepts new data.
```supercollider
// yikes, I like OOP but this is a bit silly
n.dataA_(notenum).dataB_(velocity).play(m, 0);

// or
n.play(notenum, velocity);
```
Usage code can then retain references to `MIDI...Message` objects instead of output ports.
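For instance, a sequencing routine can hold the message object and never touch the port directly. (A sketch against my prototype; I'm assuming here that `latency` is settable on the message, per the "properties of the message object" idea above.)

```supercollider
(
m = MIDISender(0).connect(1);
n = MIDINoteMessage(device: m);
n.latency = 0.2;  // assumed property; timing travels with the message object

// usage code holds 'n', never the port
Routine {
    [60, 64, 67].do { |note|
        n.play(note, 64, 0.5);  // dur = 0.5 -> automatic note-off
        0.25.wait;
    };
}.play;
)
```

Swapping the target for a VST instrument would then mean changing only the `device:`, not the usage code.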
A fun new feature: `MIDI...Message` objects can multichannel expand!
```supercollider
n.play([60, 64, 67], 64, 2);  // play a triad!

MIDIControlMessage([... 20 ccnums...], [... 20 values...], device: m).play;
```
(All of these examples are currently working in my prototype.)
Anticipating some questions:
“Why not just make latency a property of `VSTPluginMIDIProxy`?”

I did discuss that with spacechild1. The relevant replies to me:

> “MIDIOut.latency is only a workaround to (roughly) synchronize a MIDI device with the Server’s audio output. Why should I adopt it for VSTPluginMIDIProxy when it’s not necessary at all?”

> “After all, (external) MIDI devices are different from scsynth. Of course, you can try to hide the differences behind a common interface, but when the abstraction becomes leaky or starts to break down, it’s better to stop and take a step back.”

(Taking a step back is exactly what I’m doing here.)
“Why not just add `latency` to `noteOn`, `noteOff` etc. methods?”

Hmmmm… gut feeling? Also: if `noteOn` has a latency argument, then `write` needs it too… and then, what about the instance variable? I guess it’s not terrible to say that `m.latency` represents a default timing parameter, which can be overridden per message. But it smells a little funky to me.
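To spell out that rejected alternative (hypothetical signatures, not the current interface: today `MIDIOut:noteOn` takes only `chan, note, veloc`):

```supercollider
// Hypothetical, NOT the current interface: latency as both an
// instance variable (default) and a per-call argument (override).
m = MIDIOut(0);
m.latency = 0.2;            // default timing for every message
m.noteOn(0, 60, 64);        // implicitly uses m.latency
m.noteOn(0, 60, 64, 0.05);  // per-message override... and write, control,
                            // bend etc. would all need the same extra argument
```

The override itself isn’t terrible; it’s the duplication across every message-sending method that bothers me.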
- Aside: This type of question/criticism might really be motivated by “I like the current way and don’t want it to change too much” – which is understandable, but doesn’t say anything about which programming interface is better (or why one interface might be better for some circumstances).
I could either put this out as a quark, or propose it for the main library.