Server clock vs. Language clock

I have been trying to test the server clock against a language clock. The code below always, regardless of the chosen rates (here 1, 5.66 and 13), shows a periodicity in the offsets between server time and language time, as seen in the plot. For rate = 1, the periodicity is always 4 values = 4 seconds, but the other rates produce different periodicities. This is regardless of whether I run the code off the internal sound card on the M1 or my Apollo Solo.

Can someone help me understand what is going on and what is causing the different periodicities?

(
SynthDef(\time, {
	var trig = Impulse.ar(\rate.kr(1));              // trigger at the chosen test rate
	var time = Sweep.ar(Impulse.ar(0));              // server time elapsed since synth start
	SendReply.ar(trig, '/reply', [time, \id.kr(0)]); // report time and synth id back to the language
}).add;
)

(
var rates = [1, 5.66, 13];
l = { List.new } ! rates.size; // one list of offsets per rate
t = TempoClock(1);
o = OSCdef(\o, {|msg|
	var time = t.beats;           // language-side time at message receipt
	l[msg[4]].add(time - msg[3]); // offset = language time - server time, per synth id
}, '/reply');
x = rates.collect{|rate, i| Synth(\time, [rate: rate, id: i]) };
{
	12.wait;
	x.do{|n| n.free };
	{ l.plot }.defer;
}.fork
)

The above code produces this plot:

At the very top of my wishlist for SC4 are synchronized server and language clocks. Would that be possible, or are there technical obstacles making it impossible?

I’ve observed in Linux (and I think also in Windows) a roughly periodic timing jitter in OSC message receipt. That is, you might not be seeing language clock jitter – you might be seeing discrepancies in the timing of message transmission and/or receipt. The internal server might help because it doesn’t go through UDP, but I didn’t try that.
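
One way to probe the language-side receipt jitter in isolation is an sclang-to-sclang loopback; here is a rough sketch (the message name and rate are arbitrary, and the result mixes scheduling and receive jitter):

(
// send OSC to sclang's own port at a fixed rate and record arrival times
var addr = NetAddr("127.0.0.1", NetAddr.langPort);
var times = List.new;
OSCdef(\loopback, { times.add(Main.elapsedTime) }, '/loopback');
fork {
	100.do { addr.sendMsg('/loopback'); 0.05.wait };
	1.wait;
	OSCdef(\loopback).free;
	// inter-arrival times; deviations from 0.05 reflect scheduling + receive jitter
	{ times.array.differentiate.drop(1).plot }.defer;
};
)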

hjh

The internal server might help because it doesn’t go through UDP, but I didn’t try that.

What is the internal server? Is it not just the default server in s?

It’s a server that runs within the sclang process (loaded like a library rather than a fully separate process). Because it’s in the same process, messages can be sent by straightforward calls instead of going through a network protocol. (It doesn’t even open a UDP port, which is why the IDE can’t report its status.)

Server.default = Server.internal;
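
For completeness, a small usage sketch (assuming the default Server.sync_s setting, under which assigning Server.default also rebinds the interpreter variable s):

Server.default = Server.internal; // 's' now points at the internal server
s.reboot;                         // reboot so subsequent code runs in-process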

hjh

I tried running the above line, rebooting the server and running the initial code again. The result was pretty much the same, except for one ‘rogue value’ in the middle plot:

Here on Windows the resulting plot looks pretty random (as expected):

Since you are on macOS, I guess what you are seeing is the effect of clock drift. Sclang runs on the system clock, but the server runs on the audio clock. These two clocks do not run at the same speed. To enable precise OSC bundle scheduling, the server must somehow estimate the current system time for every control period.

On macOS, scsynth periodically resyncs the time every 20 seconds (see syncOSCOffsetWithTimeOfDay in server/scsynth/SC_CoreAudio.cpp). This means that the two clocks gradually drift apart for 20 seconds before being readjusted.

On Windows and Linux, however, the OSC time is continuously estimated with a time DLL (delay-locked loop) filter. This means that there is no gradual drift.
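
For intuition, a delay-locked loop keeps nudging its running time estimate toward each new measurement instead of jumping; here is a toy sclang sketch of the idea (the coefficients b and c are illustrative, not scsynth's actual values):

(
fork {
	// toy time DLL: keep a smoothed estimate of when each period occurs,
	// correcting it a little on every measurement instead of jumping
	var b = 0.1, c = 0.01;               // illustrative loop coefficients
	var period = 0.1;                    // nominal period in seconds
	var est = Main.elapsedTime + period; // estimate of the next period's time
	var inc = period;                    // running estimate of the period length
	20.do {
		var err;
		period.wait;
		err = Main.elapsedTime - est; // how far off the estimate was
		est = est + inc + (b * err);  // nudge the phase estimate...
		inc = inc + (c * err);        // ...and the period estimate
		err.postln;                   // errors stay small and smooth
	};
};
)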

You can try running the same code on Supernova, which uses a time DLL filter on all platforms (unless, of course, useSystemClock is set to false). Do you still get similar results or do they look more like mine?
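
For reference, that switch lives in ServerOptions; a sketch, assuming the server is (re)booted afterwards:

s.options.useSystemClock = true; // supernova: estimate system time continuously via the DLL filter
s.reboot;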

What I find striking in your example, though, is the exact period of 4 seconds!


What I find striking in your example, though, is the exact period of 4 seconds!

Yes, and this also is the case if I let it run for longer periods of time.

Here I am testing rates which are all related: [1, 2, 4, 8, 16].

I have never used Supernova before. What is the easiest way of testing this in Supernova?

There’s a long history of discussions about server-language timing issues, some of them going into elusive details. Maybe you’ll find something helpful in these old threads (also see the update from May '21).

Server.supernova; // switch the server program from scsynth to supernova
s.boot;

Would it be technically possible to ever have synchronized server-language clocks, maybe in SC4?

Thanks for the heads up, I will look into these old threads.

Would it be technically possible to ever have synchronized server-language clocks, maybe in SC4?

Yes, I think it would be possible. See the following discussion: Keeping sclang and scsynth in hard sync - #7 by Spacechild1

The principles outlined in the discussion could, of course, be applied to any new computer music system. (Personally, I wouldn’t bet on “SC4” to ever happen.)


I just tested the code with Supernova - basically the same result. There is still a periodicity of exactly 4 seconds for rate = 1, also when tested over a period of 60 seconds. The other rates also show a similar periodicity, but with occasional outliers.

Turns out that macOS vs. Windows was a red herring.

I have just tested again, but with an ASIO driver and a very small block size (64 samples). Now I also get the same periodicity:

To be clear: it is expected that there is some difference between language and server time. The only thing that does surprise me is that ominous 4-second period 🧐

To be clear: it is expected that there is some difference between language and server time. The only thing that does surprise me is that ominous 4-second period

Yes, but isn’t it also surprising that any rate shows periodicity and that periodicities for different rates don’t align?

All plots do seem to have the same period of approx. 4 seconds. They only differ in their phase, which is not surprising as the Synths are just created via messages and not scheduled as bundles.

More specifically:

x = rates.collect{|rate, i| Synth(\time, [rate: rate, id: i]) };

This does not guarantee that the 3 Synths are started at the same time, but the following would:

s.bind {
    // s.bind wraps the enclosed messages in a single timestamped bundle
    // (scheduled s.latency ahead), so all three Synths start at the same logical time
    x = rates.collect{|rate, i| Synth(\time, [rate: rate, id: i]) };
};

All plots do seem to have the same period of approx. 4 seconds. They only differ in their phase, which is not surprising as the Synths are just created via messages and not scheduled as bundles.

Yes, that was an oversight on my part. I just tested the opposite - sending triggers from a pattern:

(
s.latency = 0.2;
Pdef(\test).clear;
t = TempoClock(1);
x = Synth(\serverTime); // uses a \serverTime SynthDef (see the sketch below)
Pdef(\test,
	Pbind(
		\type, \set,                 // send n_set messages to the existing synth
		\id, x,
		\args, #[\serverTime],
		\serverTime, Pseq([1], inf), // fire the \serverTime trigger once per beat
		\dur, Pseq([1], inf),
)).play(t, quant: 1);

l = List.new;
o = OSCdef(\o, {|msg|
	var time = t.beats;
	l.add(msg[3].postln - msg[3].round); // deviation of the reported server time from the nearest beat
}, '/reply');
)

(
Pdef(\test).clear;
o.free;
l.array.plot
)
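
The \serverTime SynthDef is not shown above; a minimal reconstruction that would fit this snippet might look like this (my assumption, not necessarily the actual code used):

(
SynthDef(\serverTime, {
	var trig = \serverTime.tr(0);       // fires whenever the pattern sets \serverTime
	var time = Sweep.kr(Impulse.kr(0)); // server time elapsed since synth start
	SendReply.kr(trig, '/reply', time);
}).add;
)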

The time stamps from the server are really steady, so it seems that there are no timing issues with language-to-server communication, only with server-to-language.

Again: I have observed considerable timing inaccuracy in sclang when receiving UDP packets. I don’t think you can gather any accurate data about server-client sync by measuring the timing of messages coming into sclang, because you have no guarantee of good timing on the incoming messages.

hjh


so it seems that there are no timing issues with language-to-server communication

The Event stream player schedules its messages in advance as OSC bundles, exactly to enable precise timing.
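
In sclang terms, the difference looks roughly like this (a sketch, reusing the x Synth from the snippet above):

// plain message: sent immediately, executed whenever it happens to arrive
s.sendMsg(\n_set, x.nodeID, \serverTime, 1);

// timestamped bundle: stamped s.latency ahead, executed at exactly that time
s.bind { x.set(\serverTime, 1) };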

only with server-to-language.

Which is expected because these are sent as plain messages.

It would be an interesting idea to support timestamped OSC bundles for replies, though. In fact, I’m planning a plugin API extension – for entirely unrelated reasons – that would allow sending arbitrary OSC messages back to the client; this would also include bundles! This could be quite useful for sending data back to the client without losing all the timing information.
