If the Patterns run independently of your live input, the latency can be as high as needed.
The challenge is to start a pattern when I play a note on the guitar (depending on the analyzed MIDI input). When this happens, the clock's beats are set to 0 and the pattern is played on the clock with quant: 1, ergo immediate execution. Any latency other than nil adds to the latency of the first downbeat of the pattern. With latency = nil, the overall latency, judging from the recorded audio in a sample editor, is around 30 ms, of which the audio-to-MIDI conversion is responsible for 20-25 ms. With a latency of e.g. 0.03, the overall latency would double, which is undesirable.

I am messing around with a hacky way of dealing with it: s.latency = nil for the first downbeat, then 0.1 beats later set the latency to 0.03 and leave it there. The timing will of course be a little strange at the very beginning of the pattern, but my initial tests seem to indicate that this is less of a perceptual problem than one would think. Once the pattern is going, the latency is much less of an issue.

I noticed that regardless of the latency setting (if other than nil), the reported times from the server are less than the latency setting. E.g. with a latency of 0.2, the server time stamps consistently show a value of approx. 0.175 added to each beat. I wonder why it is not closer to 0.2? I know the difference is small, just wondering…
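For reference, here is a minimal sketch of that latency-switching hack. The MIDIdef name, the Pdef, and the trigger note are placeholders; it assumes MIDIClient.init / MIDIIn.connectAll have already been run and that the note-on arrives from the audio-to-MIDI converter:

```
(
// Play the first downbeat with s.latency = nil, then restore a small
// latency shortly afterwards. \guitarTrig and \myPattern are assumptions.
var clock = TempoClock.default;

MIDIdef.noteOn(\guitarTrig, { |vel, num, chan, src|
	s.latency = nil;       // no scheduling delay for the first downbeat
	clock.beats = 0;       // reset the clock so quant: 1 fires immediately
	Pdef(\myPattern).play(clock, quant: 1);
	clock.sched(0.1, {     // 0.1 beats later, back to timestamped bundles
		s.latency = 0.03;
		nil                // don't reschedule
	});
});
)
```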
If you want to minimize overall latency, you first need to figure out the lowest possible hardware buffer size that gives a stable audio signal without dropouts.
Yes, I had it backwards, thinking a higher buffer size would allow less latency. With a buffer size of 64 I get occasional crackles in the audio, so I might have to up the size to 128. A buffer size of 64 and a latency of 0.02 almost works… I will have to do more testing.
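For completeness, a sketch of that setup with the values from this thread (changing the hardware buffer size only takes effect after a server reboot):

```
// Try 64 first; fall back to 128 if the audio crackles.
s.options.hardwareBufferSize = 64;   // or 128 if 64 gives dropouts
s.reboot;
s.latency = 0.02;                    // steady-state scheduling latency to test
```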