Hi,
I’m trying to build a looper, and I wanted to check whether it’s possible to loop over a buffer seamlessly. So I play the following code while recording the audio output in Audacity in order to analyze it.
b = Buffer.alloc(s, s.sampleRate, 2); // one second of stereo, stored interleaved
(
b.set(0, 1);                    // mark the first sample (frame 0, left channel)
b.set(s.sampleRate * 2 - 1, 1); // mark the last sample (last frame, right channel)
SynthDef(\play_buffer, { arg out = 0, bufnum;
    Out.ar(out,
        // nb: no doneAction, so each Synth keeps running after playback ends
        PlayBuf.ar(2, bufnum, BufRateScale.kr(bufnum))
    )
}).send(s);
t = Task({
    loop {
        // schedule each synth via a time-tagged bundle, s.latency ahead
        s.makeBundle(s.latency, {
            Synth(\play_buffer, [\bufnum, b]);
        });
        1.wait; // the buffer is exactly one second long
    }
}).play;
)
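For reference, Buffer.set takes a flat index into the interleaved sample data, so with a two-channel buffer each flat index maps to a (frame, channel) pair. A quick Python sketch of what the two set calls above mark (assuming a 44.1 kHz sample rate):

```python
sample_rate = 44100  # assumed server sample rate
channels = 2         # stereo buffer, samples stored interleaved

def flat_to_frame_channel(i, channels=2):
    """Map a flat index into interleaved data to (frame, channel)."""
    return divmod(i, channels)

# b.set(0, 1): first frame, left channel
print(flat_to_frame_channel(0))                           # (0, 0)
# b.set(s.sampleRate * 2 - 1, 1): last frame, right channel
print(flat_to_frame_channel(sample_rate * channels - 1))  # (44099, 1)
```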
This somehow completely saturates the second output; it looks as if the last sample of the buffer leaks. Is that a bug?
So let’s mark the second-to-last sample of the right channel instead:
b.set(s.sampleRate * 2 - 3, 1);
Both channels tick fine now, but Audacity shows that the second output is two samples late. Just to check, use
b.set(s.sampleRate * 2 - 7, 1);
instead, and now both channels line up.
So there’s a timing error of 3 samples, about 7e-5 seconds at a 44.1 kHz sample rate. A float variable with a value around 1 can represent differences far smaller than that, so I would have expected the server’s logical time, combined with a time-tagged OSC bundle, to give sample-accurate timing, as long as the buffer length is not too big.
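To put numbers on that (assuming a 44.1 kHz sample rate; the 32.32 fixed-point format is what the OSC specification defines for time tags):

```python
sample_rate = 44100            # assumed server sample rate

offset = 3 / sample_rate       # the observed 3-sample error
print(offset)                  # ≈ 6.8e-05 seconds

# An OSC time tag is 32.32 fixed point, so its resolution is 2**-32 s,
# several orders of magnitude finer than the observed offset:
tag_resolution = 2 ** -32
print(offset / tag_resolution)  # ≈ 2.9e5, so time-tag precision is not the limit
```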
Where does this offset come from? Is it a bug, or something I should account for?