Interfaces: control gestures

I’m writing some code to handle common control gestures, like tapping on a button, double-tapping, long/short presses and such. I’m using it for a MIDI device, but I think it could be easily used with GUIs as well…
Is there anything out there already? Or any recommended approach? I’m sure some people must have been doing this already!

Qt has gestures but I think they’re not exposed in SC.

I think in SC we’re limited to the “lowest common denominator”, which nowadays is what X11 supports… which is not much, and even that is a hack using xdotool and libinput-gestures plus fusuma or maybe touchegg. (See this Gnome blog to also get an idea of why supporting gestures cross-platform is complicated…)

The chances of a big-corpo distro like Fedora enabling those while they can get sued by Apple over patents are pretty low.

Interesting, especially for touchscreen devices, but just to clarify: my most important use case would be MIDI interfaces. Example: distinguish between a quick press of a button that is immediately released (tap), a double-tap, a long press, and so on, so that each can trigger a different action. Like play on tap, delete loop on long press, do something else on double-tap…

I don’t think Qt is suited to work this way with MIDI devices. Ideally, in the long run, a gesture-recognition thing for the Modality Toolkit would be super nice. I imagine you could record sequences of MIDI messages as gestures, train with some variations, and build a machine learning model that recognizes them.
But before that, just something simpler that processes MIDI messages and emits different events would already do (which is more or less what I’m doing).
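
To make that concrete, here’s a very rough sketch of the kind of thing I mean, in sclang (not my actual implementation, just an illustration: the thresholds and names like ~gestureActions are arbitrary, and it only looks at note messages):

```
(
MIDIClient.init;
MIDIIn.connectAll;

~longPressTime = 0.5;   // held at least this long (seconds) counts as a long press
~doubleTapTime = 0.3;   // max gap between two short presses for a double tap

~gestureActions = (
	tap:       { |note| ("tap on" + note).postln },
	doubleTap: { |note| ("double tap on" + note).postln },
	longPress: { |note| ("long press on" + note).postln }
);

~pressTime = IdentityDictionary.new;    // note -> time of the last noteOn
~lastRelease = IdentityDictionary.new;  // note -> time of the last short release
~pending = IdentityDictionary.new;      // note -> deferred single-tap routine

MIDIdef.noteOn(\gestOn, { |vel, note|
	~pressTime[note] = SystemClock.seconds;
});

MIDIdef.noteOff(\gestOff, { |vel, note|
	var now = SystemClock.seconds;
	var held = now - (~pressTime[note] ? now);
	if(held >= ~longPressTime) {
		~gestureActions[\longPress].value(note);
	} {
		if(now - (~lastRelease[note] ? -1) < ~doubleTapTime) {
			// a second short press within the window: cancel the pending tap
			~pending[note] !? { |r| r.stop };
			~gestureActions[\doubleTap].value(note);
			~lastRelease[note] = -1;
		} {
			// wait a moment to see whether a second tap follows
			~lastRelease[note] = now;
			~pending[note] = Routine {
				~doubleTapTime.wait;
				~gestureActions[\tap].value(note);
			}.play(SystemClock);
		};
	};
});
)
```

The long press is decided on release from the press duration, while the single tap has to be deferred by the double-tap window so that a second tap can still cancel it; that’s where a bit of extra latency comes from.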

Aaahh, MIDI gestures. That’s a new term… But it’s actually more related to touch screens nowadays than you might think. ROLI and all that. ROLI does have/follow an MPE spec, the basic idea of it being:

Wherever possible, every sounding note is assigned its own MIDI Channel for the lifetime of that note. This allows Control Change and Pitch Bend messages to be addressed uniquely to that note. […] Aftertouch is sent using the Channel Pressure message. To preserve compatibility with existing MIDI devices, Polyphonic Key Pressure may be used with notes on the Master Channel, but not on other Channels.

But I don’t think there is any more standardization than that, i.e. translating the pressure, velocity, length of press and whatnot into something else is entirely implementation-dependent.
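
Just to make the channel-per-note idea concrete, listening to such a device from sclang might look roughly like this (a sketch on my part, not something from the spec; real MPE has zone and master-channel details that I’m ignoring here):

```
(
MIDIClient.init;
MIDIIn.connectAll;

~noteOnChan = IdentityDictionary.new;  // channel -> note currently owning it

MIDIdef.noteOn(\mpeOn, { |vel, note, chan| ~noteOnChan[chan] = note });
MIDIdef.noteOff(\mpeOff, { |vel, note, chan| ~noteOnChan.removeAt(chan) });

// pitch bend and channel pressure arrive per channel, so they can be
// routed to whichever note currently owns that channel
MIDIdef.bend(\mpeBend, { |val, chan|
	~noteOnChan[chan] !? { |note|
		("bend for note" + note + ":" + (val - 8192)).postln;
	};
});
MIDIdef.touch(\mpePressure, { |val, chan|
	~noteOnChan[chan] !? { |note|
		("pressure for note" + note + ":" + val).postln;
	};
});
)
```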

And MPE is basically getting a do-over in MIDI 2.0

MIDI 2.0 Per Note Controller Volume: https://www.youtube.com/watch?v=okrZYm5OJzo
MIDI 2.0 Per Note Pitch Bend: https://www.youtube.com/watch?v=x2QxFnsKWMQ

although I haven’t looked at the technical protocol details.

ROLI aside (seems tangential to the examples given):

  • You can identify a long press only if the controller gives you both press and release messages. MIDI notes will do that. CCs probably don’t (though my Nanokontrol has push buttons that send non-zero for press and zero for release). So, check that first (a quick way to do that is sketched after this list).

  • Otherwise, you’ll need to wait a bit after message receipt to see what happens next. So you’ll lose instant response. In practice, if the double tap delay is short, it’s fine.
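
For that first point, a quick way to check what a button actually sends is MIDIFunc.trace(true), or a throwaway watcher like this (nothing controller-specific assumed here):

```
(
MIDIClient.init;
MIDIIn.connectAll;
// a button that's usable for long presses should post a nonzero value on
// press and 0 on release; one that fires only once per tap won't
MIDIdef.cc(\buttonCheck, { |val, num, chan|
	("cc" + num + "on chan" + chan + "->" + val).postln;
});
)
```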

Here’s an example of a double-tap responder that I used to use: https://github.com/jamshark70/ddwChucklib/blob/master/Chucking.sc#L2088

hjh

Thanks James, I knew you probably had something there 🙂
I’m not sure how much interest there will be in this… mine, for example, is quickly going down (nice to have the option, not so nice to overload a controller with actions, at least the way I’m doing it), but I’ll share my implementation as soon as I write some docs for it.
