Server latency only sometimes?

In the code below, ~erra[0] is a percussive sample, and I want to trigger samples directly and in time.

With s.latency set to my usual default of 0.2, this plays instantly when I execute it:

~erra[0].play

But using a simple buffer-playing SynthDef and an event, there is a noticeable delay:

SynthDef(\bf, { |out = 0, buf = 0, amp = 0.1, freq = 261.6255653006, pan = 0.0, rev = 0, loop = 0|
	var sig = PlayBuf.ar(2, buf, BufRateScale.kr(buf) * (freq / 60.midicps) * ((rev * -2) + 1), startPos: rev * (BufFrames.kr(buf) - 2), doneAction: 2, loop: loop);
	sig = Balance2.ar(sig[0], sig[1], pan);
	Out.ar(out, sig * amp)
}).add;

(instrument: \bf, buf: ~erra.at(0), a: 1).play // plays late

I have to set s.latency much lower to get it to play in time with me executing the code.

Why?

It actually works to specify latency in the event:

(instrument: \bf, buf: ~erra.at(0), a: 1, latency: 0.05).play

(But latency: 0 will give you “late” messages. When the server boots, it prints something about latency. In Linux, it’s “max output latency 46.4 ms” – I’m pretty sure there are similar messages on macOS and Windows too. The latency value in the event should be at least this amount, no smaller, plus a small margin to account for network transmission time.)

In any case, s.latency is not a property of the networking objects. It’s there so that everything in the system using that server can refer to a consistent latency value. But individual operations can ignore or override that value. You shouldn’t expect everything to be latency-ized.
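To illustrate that point, here is a sketch (assuming the \bf SynthDef and ~erra buffers from above): a raw message carries no timestamp and so plays ASAP regardless of s.latency, while a timestamped bundle can use any latency value you choose, not just the server default.

```supercollider
// s.latency is only a shared default; individual sends can ignore or override it.
s.latency = 0.2;

// A raw message (no bundle, no timestamp) plays ASAP, ignoring s.latency:
s.sendMsg(\s_new, \bf, -1, 0, 1, \buf, ~erra[0].bufnum);

// The same message in a timestamped bundle honors whatever latency you pass:
s.sendBundle(0.05, [\s_new, \bf, -1, 0, 1, \buf, ~erra[0].bufnum]);
```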

hjh


@tedthetrumpet – it is worth mentioning that latency can be set to nil:

s.latency = nil

In my current code I am relying on patterns being played immediately (or as fast as possible) on a MIDI note-on message, and setting latency to nil has proven the best way for me. Even though this can theoretically lead to timing issues, in my experience and on my setup (macOS on an M1 laptop) a latency of nil works better than, say, a latency of 50 ms, and means no “late” error messages. I am still able to schedule patterns on a TempoClock, and the timing issues are negligible. I must admit that I lack a deeper understanding of the precise implications of having latency set to nil, but as I said, it works for me.
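As a minimal sketch of that kind of setup (the \bf SynthDef and ~erra buffers are assumed from earlier in the thread; \trig is just an arbitrary MIDIdef key):

```supercollider
s.latency = nil; // events sent without timestamps -> play ASAP, with some jitter

// Fire a short pattern as soon as a MIDI note-on arrives:
MIDIdef.noteOn(\trig, { |vel, num|
	Pbind(
		\instrument, \bf,
		\buf, ~erra[0],
		\amp, vel / 127,
		\dur, 0.25
	).play;
});
```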

Just to be clear: Synths execute ASAP; Events obey s.latency.

You can force Synths to use latency by wrapping them in s.bind { }.

This can be useful when you have some event- or pattern-based things you need to sync up with Synths.
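A sketch of that technique, assuming the \bf SynthDef and ~erra buffers from earlier in the thread:

```supercollider
// Without s.bind: the Synth message goes out untimestamped and plays ASAP,
// so it lands slightly ahead of concurrently scheduled Events.
Synth(\bf, [buf: ~erra[0]]);

// With s.bind: the message is wrapped in a bundle timestamped at
// now + s.latency, so it lines up with Event/pattern playback.
s.bind {
	Synth(\bf, [buf: ~erra[0]]);
};
```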


Server messages or bundles sent without a timestamp will be quantized to the hardware block boundary, so the timing impact depends on the hardware buffer size. If your hardware buffer is 256 samples, at 44.1 kHz you would get up to about 5.8 ms of timing jitter, which may be acceptable. At 1024 samples, it’s about 23 ms, which is not acceptable for anything rhythmic (though it would be OK for slow-moving soundscapes). That is, some messages would sound pretty much instantaneously, but if a message arrived just after the hardware buffer boundary, it would be delayed almost a full buffer, until the next one.

With a timestamp, synths are quantized to the control block, as long as the message arrives before the hardware buffer boundary. With default settings, that’s about 1.45 ms jitter, which is ok for almost everything.
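The arithmetic behind those figures can be checked directly in sclang (worst-case jitter is one buffer or one control block of delay):

```supercollider
// Untimestamped messages: worst-case jitter = hardware buffer duration.
(256 / 44100)  // -> ~0.0058 seconds
(1024 / 44100) // -> ~0.0232 seconds

// Timestamped bundles: worst-case jitter = control block duration.
// With default settings this is 64 samples:
(s.options.blockSize / s.sampleRate) // typically 64/44100, ~0.00145 seconds
```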

hjh


@jamshark70: when using several patterns inside a Ppar with s.latency = nil, can I be sure that the internal timing between the patterns is preserved (this is my understanding/experience)? So if I understand this correctly, I cannot know exactly when the patterns will start playing, but I can be sure they stay in sync inside a Ppar; whereas if separate patterns (not inside a Ppar) are played quantized to the same beat, there might be/will be timing issues.

From the description you gave above I would assume a relatively short s.latency value would work better than nil; however, I cannot really go below 50 ms without getting a serious amount of “late” messages, sometimes even to the point of crashing SC. Is this expected behavior?

Server messaging latency affects one thing and one thing only: the timing of the execution of OSC bundles sent to the server. It’s a one-directional communication, from language to server, affecting only the target (the server).

Pattern timing is 100% controlled by a language-side clock. This clock is not the target of the one-directional communication, therefore s.latency cannot affect clock timing.

This isn’t the right inference – messaging latency is totally irrelevant to this.

What you cannot be sure of, with nil latency, is that the timing of synths from one moment to the next will be consistent.

The source of the messages will be entirely consistent but the timing on the receiving side (the server) will be jittery.

hjh

Thanks for clarifying.