How to know the hardware buffer size

I’m afraid I don’t understand the question.

First off, 1 period / 375 periods per sec is 1/375 sec (about 2.67 ms), not 1/4 sec. A 1/4 second delay needs 12000 samples at 48 kHz.
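The arithmetic can be sketched quickly (a minimal illustration, not SuperCollider code; the 128-sample period size is an assumption, chosen because 48000 / 128 = 375 periods per second):

```python
sample_rate = 48000   # Hz
period_size = 128     # samples per period (assumed: 48000 / 128 = 375)

periods_per_sec = sample_rate / period_size   # 375 periods per second
period_dur = 1 / periods_per_sec              # one period = 1/375 sec (~2.67 ms)

delay_sec = 0.25                              # the desired 1/4 second delay
delay_samples = delay_sec * sample_rate       # 12000 samples at 48 kHz

print(period_dur, delay_samples)
```

So a single period is nowhere near 1/4 second; the delay spans 12000 samples, i.e. about 94 periods at this period size.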

Then, the hardware buffer size affects the latency with which incoming control messages will be processed (unless they are timestamped). But if a UGen is controlling a hardware device, then it must be by way of the UGen’s output, which maintains full time resolution regardless of hardware buffer size.
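To make the latency point concrete: an un-timestamped control message may wait up to roughly one hardware buffer before it takes effect, while a UGen's output changes sample by sample. A rough sketch of that worst case (the 512-sample buffer size is an assumption for illustration, not from the original question):

```python
sample_rate = 48000   # Hz
hw_buffer = 512       # hardware buffer size in samples (assumed)

# An un-timestamped message is processed at buffer boundaries, so in the
# worst case it waits about one full hardware buffer:
worst_case_latency = hw_buffer / sample_rate  # ~10.7 ms

# A UGen's output, by contrast, has single-sample resolution:
ugen_resolution = 1 / sample_rate             # ~20.8 microseconds

print(worst_case_latency, ugen_resolution)
```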

Is there perhaps something else unstated about the scenario?

hjh