I am a little surprised by the results of running this test code:
(
SynthDef(\onsets, {
    var time = Sweep.ar(Impulse.ar(0));
    SendReply.ar(Impulse.ar(1), '/reply', time);
}).add;
)
(
~initTime = Main.elapsedTime;
o = OSCFunc({ |msg, timeStamp|
    var time = timeStamp - ~initTime;
    var time2 = msg[3];
    ("oscFuncTime" + time).postln;
    ("oscFuncTime deviation in ms" + ((time - time.round(1)) * 1000).round(0.001)).postln;
    ("SynthDefTime" + time2).postln;
    ("SynthDefTime deviation in ms" + ((time2 - time2.round(1)) * 1000).round(0.001)).postln;
    "".postln;
}, '/reply', s.addr);
x = Synth(\onsets);
)
Here is what I found:
- The “SynthDefTime deviation in ms”, derived from Sweep.ar, seems to run 1 sample late. Is this expected? I am not concerned with sample accuracy at this point, just curious.
- The “oscFuncTime deviation in ms”, derived from the OSCFunc timestamp, produces both positive (expected) and negative (unexpected) values. The range of deviations is inversely proportional to the current sample rate: at 44100 Hz I get values varying by up to about 10 ms, e.g. in the -2 to 10 ms range, and if I double the sample rate, the deviations are cut in half. The deviations also seem to have some periodicity (see the post window when running the code).
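For scale, here is a back-of-envelope check of one possible explanation (an assumption on my part, not something I have confirmed): if the jitter span tracks the server's hardware buffer duration, a common buffer size of 512 samples would give roughly the observed range, and it would halve when the sample rate doubles:

```python
# Assumption (hypothesis, not confirmed): the jitter span equals the
# hardware buffer duration, on the idea that the server computes audio
# one hardware buffer at a time, so reply messages leave in per-buffer
# bursts rather than at the exact trigger sample.
def buffer_ms(buffer_samples, sample_rate):
    """Duration of one hardware buffer in milliseconds."""
    return buffer_samples / sample_rate * 1000

print(round(buffer_ms(512, 44100), 2))  # ~11.61 ms at 44.1 kHz
print(round(buffer_ms(512, 88200), 2))  # ~5.8 ms, halved at double rate
```

The 512-sample figure is purely illustrative; the actual buffer size depends on the audio driver and server options.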
Can other users verify the behaviour described above?
How can the negative values be explained, how can the timestamp happen before the trigger?
Why are the deviations so relatively large and why do they seem periodic?
This testing leads me to the conclusion that you cannot rely on the OSCFunc timestamp for precise timing applications; instead, you have to get the time from the SynthDef itself, using Sweep.ar or similar UGens.
Any thoughts?