Input and Converter Latency Compensation Questions

When I record, I use direct monitoring, so I pretty much hear the recorded audio as it happens, without latency. However, when I look at the “Input Latency” of my ASIO device, I see what looks like a measurement of how late the audio is actually being recorded. When recording, does Cubase automatically compensate for this “Input Latency”? Do I need to manually set the “Record Offset” to compensate for it?

Also, I just found out the latency of my A/D converter at my preferred sample rate and bit depth, and I would like to compensate for its small amount of latency by adjusting the “Record Offset” setting. Is anybody else here doing this? I’m just striving to have the audio recorded exactly as it was played, with respect to timing.

There’s no need to adjust anything when the driver reports the correct conversion latency.

Yes, but the driver doesn’t know, let alone report, the latency of the ADC and possible safety buffers (although I think it’s marginal).
To wagzy: yes, of course you could use the offset to compensate for that latency.

Of course the driver knows.

The hardware engineers know the conversion latency of the parts used, the driver programmers know how that conversion latency changes with sample rate, and usually this latency is then added to the buffer latency the driver reports.

If not the hardware/driver manufacturer, who else would know the conversion latency? :question:

It’s a well-known fact that roundtrip latency differs from the ASIO buffer latency reported by the driver. Most of the non-reported latency is added by USB/FireWire/PCIe bus clocks (safety buffers) and by the ADC and DAC.
But again, for the common user, 9 out of 10 times it’s negligible.

The OP didn’t ask about roundtrip latency, but about whether to “…manually set the ‘Record Offset’ to compensate for the ‘Input Latency’…”

And according to the ASIO specs, the driver should report the input latency introduced by the ADC so that the DAW can offset the recording. There are, however, interfaces with drivers that don’t comply with the ASIO specs, which is why Cubase offers a manual record offset…
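For the curious, here is a minimal sketch of peeking at what a driver actually reports, assuming the third-party Python sounddevice package (PortAudio bindings); nobody in this thread used it, so treat it as an illustration only. Cubase reads the equivalent figures through the ASIO driver; the point is that the number is driver-supplied, not measured:

```python
# Sketch: list the input latency each audio device's driver reports,
# using the (assumed) sounddevice package. These values come from the
# driver, not from an actual loopback measurement.
import sounddevice as sd

for idx, dev in enumerate(sd.query_devices()):
    if dev["max_input_channels"] > 0:
        lo = dev["default_low_input_latency"] * 1000
        hi = dev["default_high_input_latency"] * 1000
        print(f"{idx}: {dev['name']}: reported input latency {lo:.2f} to {hi:.2f} ms")
```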

Maybe this will help clear things up. I am not using an “audio interface”; I am using A/D converters with an AES soundcard. So I am talking about two different latencies here. I know the latency of my A/D converters, but I don’t know if Cubase automatically compensates for the “Input Latency” reported to the program.

TabSel, are you saying that Cubase will automatically compensate for the “Input Latency” reported to the program?

Thanks, niles and TabSel for taking the time to help.

The AES card is a digital interface, right? And your A/D converters are hooked up to it?

In that case, you might want to account for the A/D converters’ latency by manually offsetting the recording by that amount, since the AES driver can’t know it; the card doesn’t do any A/D conversion itself, right? That’s what the manual record offset is for.
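If it helps, here is a hypothetical back-of-the-envelope conversion in Python; the 0.82 ms figure and the function name are invented for illustration, so substitute the latency from your converter’s spec sheet:

```python
# Hypothetical helper: turn a converter's spec-sheet latency (in ms)
# into a record-offset value in samples. The 0.82 ms input is made up.
def offset_samples(latency_ms: float, sample_rate: int) -> int:
    return round(latency_ms / 1000.0 * sample_rate)

print(offset_samples(0.82, 44100))  # -> 36 samples at 44.1 kHz
```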

Yes and Yes.

That’s what I planned to do. It’s such a small amount that I wondered if anybody else even bothered to compensate for it. No Biggie.

The thing I really wanted to know, though, was whether or not Cubase automatically compensates for “Input Latency” when recording.

I record in the same way; my Tascam desk is the ADC/DAC, and the FireWire interface to the PC reports its latency to Cubase. Cubase does compensate for it, though I wouldn’t know (for input latency, that is), since I monitor directly from the desk. The ADC latency is really negligible, usually less than 1 ms. I never bothered getting into it.

This is the part I would like to verify. If this is clearly stated in the manual, please, tell me where I can find it. I really want to be very sure.

It just works, don’t sweat it! The latency is reported to Cubase, and Cubase will compensate. If it didn’t work, we’d all be complaining like hell about it.

LOL! But how do I know? How can I verify this? I don’t wanna guess or assume. I want to be sure. I mean, I don’t have to measure it for myself, but I’d like some sort of verification that (or reason why) “Input Latency” is compensated for when I record.

To me, this seems like a pretty important issue, but I don’t see any mention of it in the manual…

Well, we’re all stupid then!!! And so are the programmers!!!

But hey don’t take our word for it, that would be really stupid.
Latency compensation is just a phrase made up to pull the wool over our em… ears :laughing:

Try page 506, it hints at it there.

Play any Cubase wave file test signal from the output of your converter.
In the real world that is what you’re trying to sync to.
Plug that back into the converter’s input and record it.
Play the two files side by side and reverse one’s polarity.
If you hear dead silence, you’re good to go.
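For anyone who wants to automate that comparison, here is a rough sketch of the null test in Python, assuming the third-party numpy and soundfile packages; the file names are placeholders:

```python
# Sketch of the null test described above: sum the original signal with
# the polarity-reversed loopback recording and inspect the residual level.
import numpy as np
import soundfile as sf

original, sr1 = sf.read("test_signal.wav")         # placeholder file name
recorded, sr2 = sf.read("loopback_recording.wav")  # placeholder file name
assert sr1 == sr2, "sample rates must match"

n = min(len(original), len(recorded))
residual = original[:n] - recorded[:n]  # subtraction = polarity-reversed sum

peak = np.max(np.abs(residual))
print(f"residual peak: {20 * np.log10(max(peak, 1e-12)):.1f} dBFS")
# A very low residual peak means the two files null, i.e. the timing lines up.
```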

That’ll never null completely, given the artifacts from D/A and A/D conversion, the cable, the circuitry, possibly a preamp on the input, etc.

I’m not saying it doesn’t work. What I’m saying is: since I’m reading the manual and I haven’t found anything that says “Input Latency” is compensated for, could somebody point out to me where this has been officially stated or measured? I know that when I have used external hardware plug-ins I’ve had to manually tick a button to make Cubase compensate for their latency, and Cubase usually compensates for the latency of plug-ins, but does it compensate for the “Input Latency” when recording?

As mentioned before, you can check by measuring your roundtrip latency. You can do this with a loopback recording, direct and through your ADC/DAC, and compare the files at sample level, or just use a tool like Centrance’s Latency Test Utility, which is much easier of course (turn off your monitor speakers).
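One possible way to do that sample-level comparison yourself, as a sketch in Python (again assuming numpy and soundfile, with placeholder file names and mono files):

```python
# Sketch: estimate the roundtrip delay at sample level by finding the lag
# that best aligns the loopback recording with the original test signal.
import numpy as np
import soundfile as sf

sent, sr = sf.read("test_signal.wav")            # placeholder file name
received, _ = sf.read("loopback_recording.wav")  # placeholder file name

# Mono assumed; take the first channel if the files are multichannel.
if sent.ndim > 1:
    sent = sent[:, 0]
if received.ndim > 1:
    received = received[:, 0]

# The peak of the full cross-correlation gives the delay of `received`
# relative to `sent`.
corr = np.correlate(received, sent, mode="full")
delay = int(np.argmax(corr)) - (len(sent) - 1)
print(f"roundtrip: {delay} samples ({delay / sr * 1000:.2f} ms)")
```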

For instance, when I do an internal roundtrip (digital loopback within my audio driver) of my audio path at a buffer size of 256 samples and 44.1 kHz, it takes a pulse 547 samples (12.40 ms) to go round.
When I deduct 2 × 256 samples of buffer size (ASIO input and output) from 547, I end up with 35 samples of extra latency. This is consistent with the safety buffer specifications provided by the manufacturer of my PCI-E card.

Now when I do the same test with the A/D included in the signal path, the total roundtrip is 617 samples (13.99 ms), so the A/D alone introduces 70 samples (about 1.6 ms) of extra latency.
So the safety buffer (35 samples) and the A/D conversion (70 samples) together add 105 samples (about 2.4 ms) of extra, non-reported audio latency.

When I compare the results of my test with the ADC/DAC specifications provided by the manufacturer, it’s again pretty consistent.

So if I truly wanted to work sample-accurately, I could adjust my record offset by the latency introduced by the ADC and the safety buffer. But personally, I think it’s just too small to even bother.
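Here is that arithmetic as a small Python sketch; the 547 and 617 sample roundtrips are the values measured above and are specific to my system, so substitute your own:

```python
# Reproducing the record-offset arithmetic from the measurements above
# (44.1 kHz, 256-sample ASIO buffer). Roundtrip figures are from my own
# loopback tests; yours will differ.
sr = 44100
buffer_size = 256

digital_roundtrip = 547  # samples, digital loopback only
analog_roundtrip = 617   # samples, loopback through the A/D

safety = digital_roundtrip - 2 * buffer_size       # 35 samples
ad_latency = analog_roundtrip - digital_roundtrip  # 70 samples
offset = safety + ad_latency                       # 105 samples

print(f"record offset: {offset} samples ({offset / sr * 1000:.2f} ms)")
# -> record offset: 105 samples (2.38 ms)
```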

It is interesting geeky stuff to know though :wink:

No it doesn’t, everything you record is out of time all the time. Makes the program completely useless for recording, don’t really know why I bother with it?

Thanks, Split! This will do! From Cubase manual page 506:
“Internal mixing and latency
One problem with mixing inside the computer is the latency issue we mentioned earlier. The VST engine always compensates for record latencies,…”

Also, thanks, niles! I got Centrance and tested my system. That’s a cool little app.