Rethinking MIDIOut

A while back, I ran into a difficulty: MIDIOut synchronizes messages with server synths by making “latency” a property of the MIDIOut port, but VSTPluginMIDIProxy, which imitates MIDIOut’s interface, doesn’t implement latency. This complicates building higher-level abstractions, since they end up needing two versions (one for MIDIOut, the other for VSTPlugin).

I took this as an opportunity to consider alternate designs. I’d love to hear comments on this approach.

The design I’m considering now (see this gist) looks like this:

  • A MIDISender or VSTPluginMIDISender is the transport mechanism.

    • MIDISender would follow MIDIOut conventions for selecting and connecting to ports (newByName etc.).
    • VSTPluginMIDISender talks to a VSTPluginController.
  • MIDI...Message objects encapsulate the message types.

m = MIDISender(0).connect(1);  // or VSTPluginMIDISender(controller)

n = MIDINoteMessage(device: m);

// note: 2 is a dur parameter
// automatic note-off! (nil duration suppresses auto-release)
n.play(60, 64, 2);

An earlier version of this approach required you to put the MIDI parameters into the object first and then call play(midisender, latency), but I quickly felt that this was a bit clumsy to use. So I tried inverting it: the midisender and latency are properties of the message object, and play accepts new data.

// yikes, I like OOP but this is a bit silly
n.dataA_(notenum).dataB_(velocity).play(m, 0);

// or
n.play(notenum, velocity);

Usage code can then retain references to MIDI…Message objects instead of output ports.
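
For instance, a hypothetical usage sketch (the variable names are just illustrative):

(
~transport = MIDISender(0).connect(1);   // or VSTPluginMIDISender(controller)
~lead = MIDINoteMessage(device: ~transport);
~pad = MIDINoteMessage(device: ~transport);
)

// later code holds only the message objects, never the port itself
~lead.play(72, 80, 0.5);
~pad.play(48, 64, 4);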

A fun new feature – MIDI…Messages can multichannel expand!

n.play([60, 64, 67], 64, 2)  // play a triad!

MIDIControlMessage([... 20 ccnums...], [... 20 values...], device: m).play;

(All of these examples are currently working in my prototype.)

Anticipating some questions:

  • “Why not just make latency a property of VSTPluginMIDIProxy?” I did discuss that with spacechild1… relevant replies to me: “MIDIOut.latency is only a workaround to (roughly) synchronize a MIDI device with the Server’s audio output. Why should I adopt it for VSTPluginMIDIProxy when it’s not necessary at all?” “After all, (external) MIDI devices are different from scsynth. Of course, you can try to hide the differences behind a common interface, but when the abstraction becomes leaky or starts to break down, it’s better to stop and take a step back.” (Taking a step back is exactly what I’m doing here.)

  • “Why not just add latency to noteOn, noteOff etc. methods?” Hmmmm… gut feeling? Also, if noteOn has a latency argument, then write needs it too – and then, what about the instance variable? I guess it’s not terrible to say that m.latency represents a default timing parameter, which can be overridden per message (see the sketch after this list). But it smells a little funky to me.

    • Aside: This type of question/criticism might really be motivated by “I like the current way and don’t want it to change too much” – which is understandable, but doesn’t say anything about which programming interface is better (or why one interface might be better for some circumstances).
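
For concreteness, a rough sketch of that rejected alternative (the first three code lines are ordinary current MIDIOut usage; the commented-out per-call override is hypothetical and not implemented anywhere):

m = MIDIOut(0);
m.latency = 0.2;       // instance variable as the default timing
m.noteOn(0, 60, 64);   // uses the default
// m.noteOn(0, 60, 64, latency: 0);  // hypothetical per-message override (does not exist)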

I could either put this out as a quark, or propose it for the main library.

Comments?

hjh

This seems like a great approach. I like the structure. Would it be possible to use Pattern objects within the play message?

In your example:

m = MIDISender(0).connect(1);

Is connect(1) the latency in this case?

So using this, it should be possible to introduce small timing offsets to get the MIDI data in SC to line up with external things which might otherwise not be in sync, right? Would this be possible to adjust while actively sending the MIDI out?

This is an aside, but what are your thoughts on introducing a panic/kill-all style message? Or would you leave it up to the receiver to clean up any potentially stuck notes?

Hadn’t thought of that – I’m not convinced it would be a good fit, tbh, since the pattern player needs to be tracked as a separate object.

The multichannel expansion that I already did doesn’t have a time component – they just play at the same time.

No. See the MIDIOut help file (SuperCollider 3.12.2).

You could have as many MIDI…Message objects as you need, each with a different latency. (Though tbh you can do the same with MIDIOut currently.)

The question itself reveals a failing in the current interface – it isn’t clear, just from reading the methods, whether latency is applied at message-sending time (in which case it’s easy to modulate) or whether it’s somehow “baked into” the connection to the OS MIDI layer. I also believed latency was wired in pretty deep, but I didn’t see any primitives to update the latency setting – and then I found that it is always passed along with the message data! So it was always a property of an outgoing message (though there was no way to pass it as such), never a property of the connection. Quite misleading.
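
To illustrate with the existing interface: since the latency value travels with each message, changing it between sends affects the very next message.

m = MIDIOut(0);        // connect to a destination as appropriate for your platform
m.latency = 0.2;
m.noteOn(0, 60, 100);  // scheduled 0.2 s ahead
m.latency = 0;
m.noteOn(0, 64, 100);  // sent immediately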

My approach here is perhaps less flexible than specifying latency with every outgoing message, but it does reflect the fact that latency belongs with the message. In my live coding system, for instance, it’s easy to create a MIDINoteMessage with latency matching the server (so that notes sound in sync) and a MIDIControlMessage with latency 0 (so that controller knobs respond quickly), both going through the same MIDISender. One good thing about this design is that latency is then “set it and forget it,” even while using the same physical connection.
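
Roughly, that scenario looks like this (a sketch only – the latency: argument name in the constructors is illustrative and may be spelled differently in the actual classes):

m = MIDISender(0).connect(1);

// note messages delayed to line up with server-rendered audio
n = MIDINoteMessage(device: m, latency: Server.default.latency);

// controller messages through the same MIDISender, with no added delay
c = MIDIControlMessage(1, 100, device: m, latency: 0);  // CC 1, value 100; latency: is assumed

n.play(60, 64, 2);   // sounds in sync with server synths
c.play;              // knob response is immediate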

That should already be there.

hjh

FWIW: I’ve released this now – ddwMIDIMessageSend (GitHub: jamshark70/ddwMIDIMessageSend), an output-transport-agnostic MIDI-sending framework.

Quarks.install("https://github.com/jamshark70/ddwMIDIMessageSend");

It will be an unfamiliar idiom for many users, but it just might grow on you.

And… oops. I had overlooked that before. It’s in there now: MIDIControlMessage.panic or MIDINoteMessage.panic.

hjh
