Understanding latency, buffer settings, and delay compensation, and dialing in your system to be accurate

Why is it that Cubase can auto-compensate for AD/DA and buffer latency when using the External FX plugin, but can’t with the normal, regular record inputs?

You need to explain the problem you’re having, as you may be set up incorrectly. Is it possible that Cubase IS auto-compensating but you don’t want it to… Or are you referring to record offset time, for example?

I.e. if I sync an external synth running an arp/sequencer and record it via an input, it’s perfectly in sync - but that very much depends on whether I’m monitoring dry or not. If I’m monitoring dry/through hardware then I don’t want any latency compensation applied, for example, as that will be applied on playback based on any FX chain I place on it.

Skijumptoes, I’m not sure you’re presenting this correctly. If you monitor via hardware as you record/add a part to an existing track, Cubase needs to take into account the latency of the track playing back as it leaves the DAW - typically half of the total round-trip latency listed in your ASIO driver info. In other words, if you have 10ms of latency listed, there will be a 5ms delay as the new track is recorded that Cubase needs to compensate for.

In other words, if you turn this feature off (Cubase’s delay compensation feature) as you record a new track (monitoring via external hardware), then your new part will be 5ms late vs. the original track on playback.
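To put rough numbers on that, here’s a minimal sketch (my own illustration, not Cubase’s actual code) of the 5ms shift described above - the new recording has to be pulled back by the output half of the round trip:

```python
# Minimal sketch of the compensation described above - illustrative only.

round_trip_ms = 10.0           # total latency reported by the ASIO driver
output_ms = round_trip_ms / 2  # delay before playback leaves the DAW

def compensated_position(recorded_ms: float) -> float:
    """Shift a recorded event earlier so it lines up with the timeline."""
    return recorded_ms - output_ms

print(compensated_position(1000.0))  # 995.0 -> placed 5 ms earlier
```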

To answer the OP’s question, Cubase does not have the capability to send a playback track (in the example above) 5ms into the future. Think about it. :sunglasses:

Let me get back to you, I’m doing tests, I think perhaps I made a mistake earlier.


To be honest, I’ve not spent time studying how Cubase works, as it seems very tight to me without further work. And yeah, I’m the master at not presenting things correctly; I was pretty tired earlier lol. :slight_smile:

But based on other DAWs where I have had problems (Ableton, for example): if you’ve got software monitoring enabled in the DAW and on the track you’re recording into, then the latency of plugins etc. will be applied to your recording input so that you record ‘what you hear’, basically. I.e. if you have a synth arp being software monitored, the session has 20ms of delay from plugins, and you have that arp in time with the project loop, it should play back the same. Likewise, if you direct monitor (i.e. disable software monitoring) then different compensation would apply, but it should still play back fine.

The problem comes, on some DAWs, where people enable software monitoring and then mute the track, hit record and hardware monitor instead. But, as I’ve said, I don’t know how Cubase does things yet - and hopefully never ‘need’ to. :slight_smile: Logic had a slight issue with busses and PDC too.
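As a purely hypothetical illustration of that ‘what you hear’ logic (not any DAW’s documented behaviour), the offset subtracted from a new recording could depend on the monitoring mode like this:

```python
# Hypothetical illustration only - not any real DAW's API. The compensation
# applied to a new recording depends on what the performer was hearing.

def record_offset_ms(software_monitoring: bool,
                     io_latency_ms: float,
                     plugin_latency_ms: float) -> float:
    if software_monitoring:
        # The performer heard the plugin chain, so its latency is baked
        # into their timing and must be subtracted as well.
        return io_latency_ms + plugin_latency_ms
    # Direct/hardware monitoring: only converter and buffer latency applies.
    return io_latency_ms

print(record_offset_ms(True, 10.0, 20.0))   # 30.0
print(record_offset_ms(False, 10.0, 20.0))  # 10.0
```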

But the question remains… what is the specific question being asked here? ’Cause I’ve seen so many people complain about PDC ‘not’ working across various DAWs when it is just doing its job and they don’t understand its subtleties.

Okay, so audio recording always seems to be in sync with my setup, which is good. The DA->AD round trip is always sample-accurate regardless of buffer changes, regardless of whether ASIO-Guard is on, regardless of whether there are latency-inducing plugins in the session, and regardless of Constrain Delay Compensation.

Where I’m having latency problems is in two scenarios involving MIDI and audio:

1.) If I am sending gridded MIDI out to a synth and recording its audio, the audio comes back in between 565 and 588 samples late (at 88.2 kHz).

2.) If I am playing an analog synth and recording both its audio AND MIDI, the MIDI is around 400 samples late (at 88.2 kHz) - see the millisecond conversions below.
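For reference, those sample offsets convert to milliseconds like so:

```python
# Converting the measured offsets above to milliseconds at 88.2 kHz.
sample_rate = 88_200

for samples in (565, 588, 400):
    print(f"{samples} samples = {samples / sample_rate * 1000:.2f} ms")
# 565 samples = 6.41 ms, 588 samples = 6.67 ms, 400 samples = 4.54 ms
```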


So I’m now thinking this is why I need to be using External Instruments, as that gives a delay setting per instrument? Is there another way of compensating for this? I’m using a MOTU MIDI XT to handle MIDI.

I’ve only used my MIDI gear via the External Instrument function, but of course you need an ADAT/interface input per instrument for that to work effectively. So if you have that in place, it’s a no-brainer, as there are a few other benefits too - would be interesting to see if it helps you out.

I’m not sure at what point Cubase applies correction to the recorded audio; in other DAWs I’ve used, it was related to whether the monitor button was enabled on the track or not.

How are you monitoring the synth in scenario 1 - is it through Cubase, or direct?

There are so many configurations it’s easy to get caught out, but I upgraded to Cubase Pro only about 6 months ago and I’ve found it extremely good in regards to PDC and sync - other than the audio cutting out when you add a plugin/send etc. (grr!). Hopefully someone can give you more definitive/experienced advice, as I’m all ears on this subject, for sure.

I almost always monitor direct using my Lynx Aurora AD/DA’s mixer, though I’m thinking about starting to use the Cubase Control Room/cue system more.

Hoping you get a decent reply, mate, as I’ve been down that rabbit hole before… I’ve got 5 synths, 2 drum machines and 3 FX units in a rack… And touch wood, Cubase has been brilliant at accommodating any external gear for me.

In other DAWs I’ve had to break it all down and investigate just as you’re doing now - and found myself doing more IT work than music, which sucks as it leads you to question everything! One thing I’ve learnt over the years is that most software is built around matching up ‘what you hear’ at the point of recording: even if you play a guitar live with a plugin giving you latency, the human brain allows for that and we subconsciously play a little early to compensate. So even though testing it doesn’t always make sense - using it does.

The one question I’ve got on my mind that may be affecting you: if you’ve got a standalone synth with an onboard arp/sequencer running, and you sync it to the playing track manually - how does Cubase know if you’re direct monitoring or software monitoring when you hit that record button, so it aligns on playback? Does the ‘monitor’ button make a difference, for example? If there are 20ms+ of latent plugins in that project then the difference could be huge.

Hmm, I’ll have to think about this. I’m trying to understand the question first.


‘Standalone synth’ meaning software, or do you mean hardware?

Hardware - in my case, let’s say my Yamaha Montage… So I have a nice arp that I’ve got set up and it’s playing at 120 BPM… Cubase is playing at 120 BPM and I sync it live as a DJ would, so they’re playing in unison. Great, happy with that…

I then hit the record button and record that loop in.

BUT, imagine this… If my project has 20ms of latency, then depending on whether I’ve synced up ‘by ear’ using direct monitoring vs. software monitoring in Cubase, that arp will be set 20ms differently due to the source I was using to set up the sync. How Cubase knows whether I’m software or hardware monitoring is really important to establish in that instance.

Well, I would probably try to use some sort of MIDI sync and/or MMC in that case :smiley: (my next test).

But either way, Cubase should always be compensating the record ins - this is what my tests so far are showing… I’m trying to intentionally get the audio out of sync with the grid and can’t.

So essentially you are listening to delayed playback and you are setting your arp to that delayed BPM click. But Cubase KNOWS that what you are listening to is delayed - whether you are a drummer playing to a click, or setting your arp to match by ear - so it should be accounting for that on record tracks.

You can see in the animated GIF that the analog-to-digital input on channel 16 (the return of the External FX plugin) is nearly a half-beat off, yet is still aligned in Cubase on the record track.

This is all with a number of latency-inducing plugins on, the highest being 1154 samples on the main bus, as well as the highest latency settings I could muster in Studio Setup.

I guess the External FX plugin is always running at its max background latency potential, which is 100ms? Because in the above test, I had it set to 0.

So it was going out channel 16 and coming back in on channel 16, which is what the External FX plugin’s ins/outs were set to.

If I use my patchbay to steal that signal and input it on a different channel such as channel 1, then the recording will be out of sync.

I believe the External FX plugin’s Delay value is for if you are using digital outboard gear, which has its own AD/DA and processing latency - you use the measuring pulse or set it manually. But either way, it appears the protocol with the External FX plugin is that it is always running its max delay value of 100ms in the background, and then adjusts for whatever value you set… which in my case is zero, because I’m not using any digital outboard.

So my question in the other thread, regarding the External FX plugin specifically, was: if it’s zero, it’s thus no different than a manual send/return and doesn’t need to be turned off by Constrain Delay Compensation - but based on how they designed it, it gets turned off regardless of the zero.

Ok, I can see how you came to that conclusion, and good research.

BUT! …wink :slight_smile:

The plugin may be used inline with effects AFTER it, so it IS important for it to report latency, and for the input/output delays to be applied at the point where it sits in each track’s inserts order. Also, it WILL add to the latency of a live recording if you’re software monitoring - so it makes sense for it to be disabled when the session is constrained.

You’re maybe looking at the global constrain setting and applying it to one scenario of single FX sends and returns… when there are many other scenarios it has to fit into.

Remember, the plugin’s goal is to turn any external FX unit into an assignable insert, and so the plugin has to be standardised and react to elements such as constrain just as any other plugin you could assign would.

For reference, I just tried a project with the External FX plugin running and latency constrained, and moved the threshold up slightly, ‘applying’ each time, until it was no longer disabled. The value I reached was identical to my ASIO driver’s in and out latency combined (8.9ms); at that point it stayed enabled. To me, that makes perfect sense.
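If that’s right, the constrain test reduces to a simple comparison. Here’s a minimal model (my assumption, not documented Steinberg behaviour) where the plugin reports the driver’s in + out latency plus the manual offset - the 4.4/4.5ms split below is hypothetical, chosen to sum to the 8.9ms observed:

```python
# Minimal model of the threshold behaviour observed above - an assumption,
# not documented Steinberg behaviour. Driver figures below are hypothetical.

def external_fx_stays_enabled(in_ms: float, out_ms: float,
                              offset_ms: float, threshold_ms: float) -> bool:
    reported_latency = in_ms + out_ms + offset_ms
    return reported_latency <= threshold_ms

print(external_fx_stays_enabled(4.4, 4.5, 0.0, 8.9))  # True  - stays enabled
print(external_fx_stays_enabled(4.4, 4.5, 0.0, 8.8))  # False - gets disabled
```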

The 0 that you keep referring to is just an offset too, as I’ve said before - and yes, that offset is for if your external gear has a latency and needs further compensation. I tested all mine with a ping test and they were close enough for me to leave it at 0. The plugin will not be running at 100ms by default, even if the offset is set to 0 though - it would kill your project’s live latency otherwise.

Yeah, some of what you mentioned here was already fizzing in my head, but I still don’t quite get why.

I guess part of the reason I’m not getting it is because I don’t do software monitoring in Cubase; I monitor straight off my AD/DA interface.

I.e., all of this becomes more relevant and more of an issue IF you are doing this:
D_Playback Beds Output → Artist Headphones
A____________________<-Overdub Recording Input
W_Overdub Recording Monitor → Artist Headphones

The latency issues arise between and after the ‘A’ and ‘W’, where the artist needs to hear their overdub - which isn’t an issue if you can monitor off your interface, because the signal hasn’t hit the DAW’s processing latency yet.

However, if you are monitoring through Cubase, the latency of any plugins you have inserted has to be added on top, and so by the time that is done, the artist is hearing a delayed version of themselves.
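Rough arithmetic for that scenario (illustrative numbers only):

```python
# What the artist hears when monitoring through the DAW is delayed by the
# round trip plus any latent inserts on the monitored path.

round_trip_ms = 10.0       # illustrative interface round trip
plugin_latency_ms = 20.0   # e.g. a look-ahead processor on the monitor path

heard_delay_ms = round_trip_ms + plugin_latency_ms
print(f"Artist hears themselves ~{heard_delay_ms} ms late")  # ~30.0 ms
```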

I get all that. How the External FX protocol fits into all this, I still don’t get, because again - technically, in theory - it is no different than my manual send/return.

So does the External FX plugin protocol simply presume it is being used alongside other plugins that have latency? (We’ve got the ‘0’ thing out of the way; I get that it has to do with external digital hardware latency not counted by Cubase.) Because again, my manual send/return still has to account for the in/out latency, and yet it is not disabled when ‘Constrain Delay Compensation’ is activated.

Maybe I’m confused between “I have something technically wrong here, thought-wise, about the topology of this” and “this is just a design quirk/borderline bug with External FX I just have to accept” - it’s really hard to tell sometimes :laughing:

Wait wait wait, I think I get it.

So, in this case, I - am - the artist.

I want to be able to monitor the External FX in time with the rest of my project and have it be latency- and even phase-coherent.


Is this it?

It’s hard to say what the source of your confusion is. But maybe you’re overlooking one or more of these points:

  • External FX always have latency.
  • CDC only disables plugins. It doesn’t apply any compensation.
  • CDC treats external FX like any other plugin.
  • Cubase cannot play audio in the future to remove all consequences of latency when recording.

Yeah, no worries, I already got it, mate.

External FX has latency applied because you need to monitor everything in time with your project. The method of manually sending out an output and returning on a separate track works and will be recorded in sync, but monitoring it would be out of sync.

Now, I could manually send/return and monitor the return off my AD/DA, but that poses another, albeit smaller, sync issue: I’m monitoring my DAW playback one trip off my AD/DA, whereas the to-be-processed signal goes out the DA and back into the AD, where it picks up a conversion latency. To get this synced, I’d have to patch my mains signal out of my DA and back into my AD and then back out to my speakers - which creates work and takes up at least an extra two channels of my converter.
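In rough numbers (hypothetical converter figures), the mismatch is one extra conversion pass:

```python
# The externally processed signal takes one extra DA->AD pass that the
# directly monitored playback does not - figures below are hypothetical.

da_ms = 0.5  # assumed DA conversion latency
ad_ms = 0.5  # assumed AD conversion latency

mismatch_ms = da_ms + ad_ms
print(f"Return lags direct playback by ~{mismatch_ms} ms")  # ~1.0 ms
```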

CDC disables plugins to defeat or minimize the latency problem of a signal having to take a double trip through the DAW for the purpose of monitoring through the DAW, if that is needed. It’s also a handy way to relieve CPU/RAM if things are erratic when recording.

Sort of, but that’s not the way I would put it.

Monitored audio is never in sync or out of sync. Synchronization during recording is determined by how you play. However, monitored audio may have latency, which can make it difficult to play in sync. And it can cause sync problems later, during playback. That latency can be minimized by various techniques, including using small audio buffers. CDC can be used to easily disable latent plugins, which can also help to reduce this latency, as long as you don’t need those effects during recording.
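For a feel of how buffer size maps to latency (per direction, before drivers and converters add their own overhead):

```python
# Per-direction buffer latency is roughly buffer_samples / sample_rate;
# drivers and converters add a bit more on top.

sample_rate = 88_200

for buffer_samples in (64, 256, 1024):
    ms = buffer_samples / sample_rate * 1000
    print(f"{buffer_samples} samples -> ~{ms:.2f} ms")
# 64 -> ~0.73 ms, 256 -> ~2.90 ms, 1024 -> ~11.61 ms
```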

It might help if you describe what you’re trying to do. I gather you need to use an external effect during recording? What effect are you using it for? You can’t use an in-the-box effect instead, at least during recording? What is the source of your audio? Are you using a virtual instrument?