On macOS, at least, you should see this line when you boot:
```
SC_AudioDriver: sample rate = 48000.000000, driver's block size = 2048
```
This is read back from the device after we attempt to set the hardwareBufferSize value from ServerOptions, so it should reflect the actual hardware buffer size. (For reference, a 2048-sample buffer at 48 kHz is about 42.7 ms per block.) I’m not totally clear how this relates to latency on macOS, since afaict you can have multiple audio clients running with different buffer sizes at the same time. My assumption is that the observable latency for an audio application is a function of its OWN buffer size, and not e.g. the “worst case” buffer size across all applications using that device, but I don’t know the deeper details here.
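For comparison, you can request a buffer size yourself from sclang and see what the driver reports back on boot. A minimal sketch (512 is just an example value; the device has the final say):

```
// Request a hardware buffer size before boot; the driver may round or
// ignore the request, so check the "driver's block size" line after boot.
s = Server.default;
s.options.hardwareBufferSize = 512;  // request only
s.options.sampleRate = 48000;
s.reboot;
```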
Supernova uses PortAudio, which has its own latency measurements that are posted when the server is launched, but I don’t know how these are calculated:
```
Latency (in/out): 0.158 / 0.045 sec
```
But sadly there’s no way to programmatically determine the latency, apart from maybe launching your own scsynth process and scraping its stdout for the above strings…
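If you did want to go that route, here’s a rough sclang sketch of the stdout-scraping idea, just to illustrate. It’s a fragile hack: it assumes scsynth is on your PATH, that UDP port 57117 is free, and that the log wording stays exactly as above:

```
(
// Boot a throwaway scsynth and scrape its stdout for the block size line.
var pipe, line, blockSize;
pipe = Pipe("scsynth -u 57117", "r");
line = pipe.getLine;
while { line.notNil and: { blockSize.isNil } } {
	if (line.contains("driver's block size")) {
		blockSize = line.split($=).last.asInteger;  // e.g. " 2048" -> 2048
	};
	line = pipe.getLine;
};
NetAddr("127.0.0.1", 57117).sendMsg("/quit");  // shut the throwaway server down
pipe.close;
blockSize.postln;  // nil if the line never appeared
)
```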