It’s great that the latency added by FX plugins/VSTs is now detailed per channel and easy to see, but what about the TOTAL project latency (i.e. the combined latency of ALL plugins used in the project, not just the channel totals)? There’s no hack/way of seeing this, is there?
(NB: I am not talking about the audio card/interface latency caused by the h/w buffer, just the latency caused by VST plugins.)
Also, what about VST instruments? It may seem like a silly question, but do VST instruments themselves add latency, or is it just FX VSTs? (Are all instrument VSTs effectively zero-latency, just needing more real-time CPU?)
No offense here, but the OP is asking for a way to show the total VST latency of the project.
The Output channel only shows its own latency, just like any other channel.
Afaik this is not possible, Cubase doesn’t show the total combined latency anywhere.
And believe me, I wish it was!
Also, instruments may have their own internal latency (for example legato patches for strings), but this isn’t reported to Cubase. If it ever were, a project with dozens of instances would make you wait several seconds before hearing a sound when pressing a key. Instead, each instrument manages its latency individually, and for those you have to play the notes a bit early.
The only project latency involved with instruments is the ASIO latency + the one added by FX plugins.
Thanks Martin and Louis_R.
As Louis_R correctly points out, the Stereo Out channel only shows the latency for that channel; there is no COMBINED project latency shown anywhere. That would be really helpful, and I would have thought fairly easy for Steinberg to implement.

Maybe at the bottom of the Control Room panel we could get a “TOTAL LATENCY + CPU/MEMORY USAGE” display which would initially show totals (combined latency of ALL plugins loaded + total CPU % + memory in use). Perhaps also a ‘Details’ button which, when clicked, would bring up a neat window with a comprehensive breakdown/analysis of everything going on in the project, covering every loaded VST instrument and effect: latency per VST, and per channel (as we already have, but unified in one place), PLUS per-instrument CPU, memory and latency (if any instruments add latency), plus buffer, with a TOTAL for each. Then we would know EXACTLY where we are before reaching for that hardware audio buffer control.
So just to clarify: VST instruments definitely never add (unreported) latency? This does seem to be the case, but I am curious to know whether any do… I find that as I add more and more VST instruments there is no increase in overall latency, just gradually higher CPU and memory use, until the inevitable crackles as the CPU spikes, and at that stage the h/w buffer must be increased.
Have you ever seen VST instruments that add 500 ms of latency to the project? The only latency you can observe with instruments is the ASIO latency, the one induced by the buffer size of your audio device, plus the one added by insert FX.
VST Instruments never report any latency to the DAW. If the latency is too high because we have too many plugins, then we enable Constrain Delay Compensation to play in “real-time”.
Instruments can obviously have their own processing latency internally, but we’re not supposed to know this.
For example, when playing legato patches there is latency inside the instrument, because it doesn’t know in advance which note you will play next in order to trigger the right portamento sample. This is never reported to the DAW; you can have 20 instruments like this and each of them has its own latency, individually.
Unlike the latency added by insert FX, it doesn’t stack up. Instruments are not part of the signal processing chain; they are generators.
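The distinction above can be sketched in a few lines. This is a rough model, not Cubase’s actual algorithm: insert FX on a channel are serial, so their latencies sum, while instrument channels run in parallel, so the compensator only needs the largest channel total. All numbers here are made up for illustration.

```python
def channel_latency(instrument_latency, insert_fx_latencies):
    """Latency of one channel: instrument + its serial insert FX chain."""
    return instrument_latency + sum(insert_fx_latencies)

def project_compensation(channels):
    """Parallel channels are aligned to the slowest one (the max)."""
    return max(channel_latency(inst, fx) for inst, fx in channels)

# Two parallel instrument channels: latencies do not add ACROSS
# channels, only WITHIN each channel's own insert chain.
channels = [
    (96, [64, 128]),   # instrument 96 + two insert FX -> 288 samples
    (64, [0]),         # instrument 64, zero-latency FX -> 64 samples
]
print(project_compensation(channels))  # 288, not 288 + 64
```

In other words, adding another instrument channel never increases the total; only the worst single path matters.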
If I could see per-plugin CPU consumption stats in a Cubase project, that would be handy! I am already doing this kind of triage with the channel latency column in the MixConsole: while mixing, I can decide when to print a track and move on, because my 2018 Intel Mac mini sometimes struggles with heavyweight VSTs.
In the same league, Camel Audio Alchemy reports 64 samples of latency, but BFD3 and Emulator X3 both report 0. All three of them are sample-based VST 2.4 VSTis…
The only thing that seems more or less certain (well, AFAICS with the limited set of VSTis that I have) is that a given non-sample-based instrument always reports 0 (e.g. Sylenth1, Loomer String). Beyond that…
And how do these latencies add to the overall project latency? A mystery to me, but I gave up trying to understand how all the ‘latency’ stuff truly works a long time ago, so I no longer really care. As long as I can play any instrument in ‘real time’, with an imperceptible delay and a reasonable ASIO load…
It’s got nothing to do with samples or no samples; I have algorithmic synth VSTis that report a few samples of latency. It probably just depends on what is needed for the synthesis, or maybe the whole VSTi runs at a higher sample rate and the latency is added by the (linear-phase) downsampling filter.
I have tested with Arturia’s Analog Lab V in VST 3. It reports a latency of 96 samples.
I created 70 instances, which would equal 6720 samples of latency if it stacked. With all 70 tracks monitored, there is no perceptible latency, and when pressing Play there is no latency either. So the latency is managed per instrument, individually. In any case, 96 samples at 96 kHz is 1 ms, so whenever I play these instruments in real time with my keyboard they output the sound after 4.5 ms instead of the usual 3.5 ms with my current buffer settings. Nothing that I could ever notice.
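The arithmetic above is just a samples-to-milliseconds conversion; a one-liner reproduces the numbers from the post (96 samples at 96 kHz is exactly 1 ms, and 70 stacked instances would be 70 ms):

```python
def samples_to_ms(samples, sample_rate_hz):
    """Convert a latency in samples to milliseconds at a given rate."""
    return 1000.0 * samples / sample_rate_hz

print(samples_to_ms(96, 96000))    # 1.0 ms per instance
print(samples_to_ms(6720, 96000))  # 70.0 ms, IF the 70 instances stacked
```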
My guess is that this also happens on playback, but only the longest latency is taken into account. If you have several instruments with different latencies, the shorter ones are delayed to match the longest one, so that every track plays in sync.
Example: Instrument A = 96, Instrument B = 64. When pressing Play, a delay of 96 samples is heard, and Instrument B only starts processing once 32 samples have elapsed after pressing Play, so that both instruments finish their delays at the same time.
So as @fese said, this latency is perhaps needed for the instrument to process everything properly, but it never adds up like VST Effects.
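The compensation guess above can be sketched as follows. This is a toy model of the described behaviour, not Cubase internals: every instrument is given an extra start offset of (max latency − its own latency), so all outputs line up at the worst latency.

```python
def start_offsets(latencies):
    """Extra delay per instrument so all outputs align at the worst latency."""
    worst = max(latencies.values())
    return {name: worst - lat for name, lat in latencies.items()}

# The A = 96 / B = 64 example from the post:
print(start_offsets({"A": 96, "B": 64}))  # {'A': 0, 'B': 32}
```

B waits 32 samples before processing, so both instruments finish their delays together at sample 96.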
So, from what I see in your tests (I admit I haven’t tested this as thoroughly as you, as it’s not a real issue on my end), the latencies of different instruments (or instances of them) do not add up, which is rather reassuring.
I usually track with a 128-sample setting in the Fireface UCX settings window. It’s a good compromise for both tracking and playback, with the added help of the ASIO Guard feature, which I have learned to appreciate recently.
My whole setup runs at 44.1 kHz. What I have always wondered (among other things) is this: let’s say I set the buffer of my UCX to 64 samples (I’ve never felt the need to; it’s a theoretical situation). If a VSTi exceeds this threshold, will the overall latency while playing/tracking it be the latency reported by the VSTi, or be doubled because a second buffer is needed to process its audio rendering?
To be honest, I never thought about instrument latency before looking into this thread. I always assumed there was none and never even noticed the latency information in the Plug-in Manager.
There is no threshold of any kind; instrument latency is introduced just like with effects, prior to the buffer. Some effects out there have a latency of more than half a second, so they completely dwarf the buffer value even if you set it to its highest. I don’t think instruments play a role in that.
The exact order is:
Instrument (not cumulative) → Effects (cumulative) → Buffer (ASIO Guard → Real Time).
Instruments play in parallel when you have multiple in the project, because they are generators, just like audio tracks. On the other hand, the generated audio has to pass serially through the effects or whatever routing follows. The latency from the effects then adds up; it cannot be otherwise.
The buffer always comes last. When the instrument is monitored, it bypasses ASIO Guard. When not monitored, it goes through ASIO Guard + Real Time, in that order. Or maybe the other way around; the performance meter shows Real Time first, so I have no clue. Still, it makes sense to me that ASIO Guard comes first, since it is a Cubase option, and Real Time last, because it’s the final buffer set by the audio device. In any case, that doesn’t change anything for the end user.
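The order described above can be put into a small sketch. All numbers are made up, and the assumption that ASIO Guard simply contributes one extra buffer’s worth of delay on non-monitored tracks is exactly the part the post is unsure about, so treat this as a mental model only:

```python
def path_latency(instrument, insert_fx, rt_buffer, asio_guard, monitored):
    """Instrument (not cumulative) -> insert FX (cumulative) -> buffer(s)."""
    total = instrument + sum(insert_fx)   # instrument, then serial FX chain
    if monitored:
        total += rt_buffer                # monitoring bypasses ASIO Guard
    else:
        total += asio_guard + rt_buffer   # playback path (assumed additive)
    return total

print(path_latency(96, [64], rt_buffer=128, asio_guard=512, monitored=True))   # 288
print(path_latency(96, [64], rt_buffer=128, asio_guard=512, monitored=False))  # 800
```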