[SOLVED] Input Timing Accuracy - Is It Possible?

Hi all, new SC user here, recently came over from MAX. SC is really amazing, as you all know, but I’m having one little niggle with it: getting it to respond quickly to user input, such as hitting a pad.

Here is what I’ve read on it:

-----s.sendMsg is faster communication to the server
-----Create a synth, pause it, and un-pause it when the human signal comes in
(somehow this does not seem to make a difference; see the sketch after this list)
-----Use a SynthDef and not quickplay (obvious)
-----Turn the latency down to zero for human-input events
-----Use SuperNova
-----Read differing accounts on using the internal server

-----Perhaps send the server OSC messages directly from the program processing the pads, MAX in this case
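
Regarding the pause/un-pause idea, here is a minimal sketch of how that can look, using the sample that ships with SC (the names \pad, ~buf and ~padSynth are just placeholders):

(
SynthDef(\pad, { |out = 0, buf, amp = 0.5|
    var sig = PlayBuf.ar(1, buf, BufRateScale.kr(buf), doneAction: Done.freeSelf);
    Out.ar(out, sig ! 2 * amp);
}).add;
~buf = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav");
)

// build the node ahead of time, but leave it paused
~padSynth = Synth.newPaused(\pad, [\buf, ~buf]);

// when the pad hit arrives, un-pause it (no scheduling latency involved)
~padSynth.run(true);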

I’m not asking “which one” to use, as I’ll probably have to do everything I can in order to get a pad to play a sample instantaneously. Hopefully this is something others have already banged their heads against the wall about? Any feedback is vastly appreciated.

(also I’m using a pretty slow computer, an old 1.3 GHz dual-core without Supernova. Could this be a significant factor? I can get good input timing accuracy with MAX, but something in SC is standing in my way)

Thank you again

If you send a message to the server with nil latency (as you mentioned, s.sendMsg), then you’ll hear the result at the beginning of the next audio hardware buffer.
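
For example, with some SynthDef already added (the \ping def below is just a placeholder; evaluate the blocks one at a time):

(
SynthDef(\ping, { |out = 0, freq = 440|
    var sig = SinOsc.ar(freq) * Env.perc(0.001, 0.2).ar(Done.freeSelf);
    Out.ar(out, sig * 0.2 ! 2);
}).add;
)

// untimestamped message: scsynth plays it at the start of the next hardware buffer
s.sendMsg("/s_new", \ping, s.nextNodeID, 0, 1);

// timestamped bundle: sent s.latency seconds ahead, tighter relative timing
// but with the added delay
s.bind { Synth(\ping) };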

I’m willing to bet that Max cannot do any better than this. Time doesn’t permit me to explain the technical reason now, but there is a technical reason.

hjh

Thank you so much. If I can achieve the perception of being instantaneous, I’m golden.

A little follow-up though: is setting the latency to nil different from setting it to 0?

UPDATE: s.sendMsg made a HUGE difference. Still going to keep looking into other methods to get that margin razor-thin, but s.sendMsg on its own got it to a ‘basically solid’ place from being really horrendous.

UPDATE #2: I’m an idiot: I was using the Windows default soundcard. I switched it over to ASIO4ALL and lowered the buffer size, and that built-in 0.096-second delay just vanished. Still glad I optimised it with s.sendMsg though. Thanks again

Ross

In short: setting latency to 0 is an impossible demand; latency nil means ‘as soon as possible’ (after the message has arrived). A more detailed explanation of server latency can be found here:

http://doc.sccode.org/Guides/ServerTiming.html

A latency of 25 to 35 milliseconds is realistic. You can quickly run some tests like this: change the latency and see whether you get ‘late’ messages (and with what delay they show up). You might get two late messages for one event, as the release might also be too late.

(latency: 0).play
(latency: 0.01).play
(latency: 0.02).play
(latency: 0.025).play
(latency: 0.03).play

The minimum latency value that sclang uses for messaging also depends on the hardware buffer size.

scsynth must calculate the entire hardware buffer in one go. The control blocks (default 64 samples) within a hardware buffer are not spread out in time over the buffer’s duration; they are all calculated back to back, as fast as possible.

If you use the latency value to schedule a synth to start in the middle of a hardware buffer, then the message must arrive at scsynth before the start of that hardware buffer. If, for whatever reason, you have to use a larger hardware buffer in some environment, then you’ll see “late” messages with latency values that work perfectly in other environments. (A month or two ago, I was doing a crazy experiment to see if I could run scsynth --> JACK --> https://github.com/Arkq/bluez-alsa but I had to set the hardware buffer very large, and I couldn’t set sclang latency lower than 0.18 sec, whereas in my normal environment I can go down to 0.05. I ended up abandoning that experiment because disconnecting Bluetooth then caused the laptop’s touchpad to die, requiring a reboot! Bad idea all around.)
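
To put rough numbers on that (the figures here are illustrative, not measurements):

(
var sr = 44100, hwBuf = 512;
var bufDur = hwBuf / sr;   // duration of one hardware buffer, about 0.0116 s
("one hardware buffer = " ++ bufDur.round(0.0001) ++ " s").postln;
// s.latency has to comfortably exceed this (plus scheduling and network jitter),
// which is why a very large hardware buffer forces a large latency value.
)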

And indeed, as you found, on Windows the MME drivers add a couple hundred ms of latency without telling client apps about it, so those apps have no way to compensate. ASIO is pretty much required.
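
If it helps anyone later, the device and hardware buffer can also be chosen from sclang before boot (the device string below is only an example; evaluate ServerOptions.devices to see the exact names on your system):

(
s = Server.default;
s.options.device = "ASIO : ASIO4ALL v2";   // example name, check yours
s.options.hardwareBufferSize = 256;        // smaller buffer, lower output delay
s.reboot;
)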

hjh


Thanks for the detailed responses. For the human-interface stuff I don’t mind late messages; the timing is all that matters. One more question though:

Why is there not a long delay when I do this and then play a sample?

s.latency = 1.0;

s.latency applies by default to Events that are played, and to JITLib NodeProxy updates. It doesn’t apply by default to server messages from other sources.
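
For instance, evaluating these one at a time (with s.latency still set to 1.0 from above, and using the built-in \default SynthDef) makes the difference audible:

// Event: bundled s.latency ahead, so it sounds about a second after evaluation
(instrument: \default).play;

// raw message: no timestamp, so scsynth plays it at the start of the next hardware buffer
s.sendMsg("/s_new", \default, s.nextNodeID, 0, 1);

// explicit bundle: opts back in to s.latency, so it is delayed by a second as well
s.bind { Synth(\default) };

// the last two sustain until freed
s.freeAll;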

hjh