I am running a setup with Cubase and 3x Universal Audio Apollos, plus a Lynx Aurora over ADAT, externally clocked, for a total of 40 I/O.
If I run a loopback test from Output 1 back into Input 1, for example, and get a delay of 30 samples, does this mean I should manually adjust the recording latency by 30 samples?
How do I know how much of the delay comes from the output vs. the input, or does that matter when making this adjustment?
No, I don’t think so. The loopback test tells you your total latency: the AD/DA converter delay plus your soundcard buffer. I see no reason to adjust for that unless you use multiple converters with different latencies. Then it makes sense, for example if you use them together in a live environment.
But you can only delay the “faster” converters if you want all converters to line up at the same latency when monitoring through your DAW. It’s impossible to speed up the slower ones on a live input.
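To make the “delay the faster one” idea concrete, here is a minimal sketch. The latency numbers (64 vs. 96 samples) are made-up examples, not measurements from any real converter:

```python
# Hypothetical example: lining up two converters by delaying the faster one.
# Assumed latencies for illustration only.
lat_a = 64   # samples, faster converter
lat_b = 96   # samples, slower converter

# On a live input we can only ADD delay, never remove it,
# so everything is padded out to match the slowest converter.
slowest = max(lat_a, lat_b)
pad_a = slowest - lat_a   # 32 samples of added delay for converter A
pad_b = slowest - lat_b   # 0 samples for converter B

signal_a = [0.5, 0.25, -0.1]          # toy audio block from converter A
aligned_a = [0.0] * pad_a + signal_a  # prepending silence = added delay
```

The key point the answer makes is visible in the code: the padding can only be zero or positive, so the slowest converter sets the latency for everyone.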
An example: a 64-sample buffer plus 2x 64 samples of (AD/DA) converter delay, 192 samples in total, gives you a roundtrip latency of about 4.4 ms at 44.1 kHz. That is roughly 1.5 meters of distance in real life… And phase can only be an issue if there is crosstalk or coherence between the sources.
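The arithmetic behind that example, as a quick sketch (the ~343 m/s speed of sound is the usual room-temperature approximation):

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate at room temperature

def roundtrip_ms(total_samples, sample_rate):
    """Latency in milliseconds for a given number of samples."""
    return 1000.0 * total_samples / sample_rate

def equivalent_distance_m(total_samples, sample_rate):
    """How far sound travels in air during that latency."""
    return SPEED_OF_SOUND * total_samples / sample_rate

# 64-sample buffer + 64 samples AD + 64 samples DA = 192 samples total
ms = roundtrip_ms(64 + 64 + 64, 44100)
dist = equivalent_distance_m(192, 44100)
print(round(ms, 2), round(dist, 2))  # about 4.35 ms, about 1.49 m
```

So a 192-sample round trip is comparable to standing a step and a half away from your speaker.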
So using the faster converters for mic inputs and the slower ones for line inputs could be a solution too. A bass mic and DI (= coherent sources) is an example of something I would not put through different converters…
Also for this reason:
Converters translate analog to digital and vice versa, but they all do it differently. So if you record the same source through two different converters, it will almost always be impossible to null the results completely (= low coherence).
The reason I ask: say I use analog gear to process my tracks after recording them. Sometimes I may run things out and back a few different times. So each time there is some delay that Cubase doesn’t know about, and my tracks are getting shifted all over?
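That accumulation is easy to sketch. Assuming an uncompensated 30-sample round-trip delay per pass (the figure from the loopback test above, used here purely as an example), each extra trip through the analog chain shifts the track further:

```python
# Hypothetical sketch: uncompensated external processing shifts tracks
# by the same round-trip delay on every pass.
PER_PASS_DELAY = 30    # samples, e.g. measured with a loopback test
SAMPLE_RATE = 44100

def cumulative_offset_samples(passes, delay=PER_PASS_DELAY):
    """Total shift after running audio out and back `passes` times."""
    return passes * delay

def offset_ms(samples, rate=SAMPLE_RATE):
    """Convert a sample offset to milliseconds."""
    return 1000.0 * samples / rate

# Three round trips through the analog gear:
shift = cumulative_offset_samples(3)           # 90 samples
print(shift, round(offset_ms(shift), 2))       # about 2.04 ms
```

This is exactly why DAWs offer external-gear latency compensation: without it, each pass stacks another fixed offset onto the track.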