I’m trying to make sense of how ASIO Guard manages its buffer settings, in my case under OS X.
ASIO Guard uses two buffers: one for record-enabled tracks (this is the buffer set by the user, e.g. 128 samples), and another for all tracks that are not record enabled. It appears that this playback buffer cannot be set manually by the user, but is instead either fixed or possibly changed dynamically according to the resources the project requires.
In the settings menu under VST Audio System, the latency of the user-set (recording) buffer is shown as an input and an output latency in ms; the total of these two figures is the round-trip latency. In the attached screenshot, the round-trip latency is 8.322ms (input plus output).
Below that is the ASIO-Guard latency (presumably output only, given that ASIO Guard’s buffer is for playback tracks).
For a user buffer of 128 samples, this ASIO-Guard latency is 11.610ms. With a bit of trial and error, I’ve worked out that this is equivalent to a buffer of roughly 512 samples (at 44.1kHz, 11.610ms works out to almost exactly 512 samples).
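For anyone who wants to check the arithmetic, here is the quick conversion I used between buffer size and latency. It assumes a 44.1kHz sample rate (not shown in the screenshot, so treat that as an assumption); Cubase’s reported figures may also include extra driver or safety offsets on top of the raw buffer.

```python
# Back-of-envelope conversion between audio buffer size and latency.
# Assumes a 44.1 kHz sample rate; real Cubase readouts may add driver offsets.

def latency_ms(samples: int, sample_rate: int = 44100) -> float:
    """Latency of one buffer of `samples` frames, in milliseconds."""
    return samples / sample_rate * 1000

def samples_for_latency(ms: float, sample_rate: int = 44100) -> int:
    """Approximate buffer size corresponding to a latency in milliseconds."""
    return round(ms / 1000 * sample_rate)

print(latency_ms(128))              # user buffer of 128 samples ≈ 2.90 ms one way
print(samples_for_latency(11.610))  # 11.610 ms -> 512 samples at 44.1 kHz
```

At that rate, the reported 11.610ms lines up almost exactly with a 512-sample buffer, which is why I suspect the playback buffer is a power-of-two multiple of the user buffer.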
When the user buffer is increased, the ASIO-Guard buffer increases with it. Interestingly, when any buffer below 128 samples is selected, the ASIO-Guard latency remains at 11.610ms.
My question is: is it possible to increase the ASIO-Guard buffer manually even when the user buffer is low? For example, would it be possible to have an ASIO-Guard latency of 49.38ms (the equivalent of a 2048-sample buffer) while still running at a very low user buffer?
Or, conversely, does Cubase increase the ASIO-Guard buffer automatically as the project demands it when running at a low user buffer?
I would love some further information on this aspect of ASIO Guard from one of the Steinberg moderators. It’s a great feature that will clearly have a big impact on performance, but I’d like to understand the mechanism behind it so I can get a feel for how much of an improvement to expect.
Cheers and thanks in advance,