When I execute the test the results are: 414 samples/9.39 ms.
This looks correct as well.
If the math that Bredo suggests is valid, 414/64 = 6.468. If I then divide 9.39/6.468 I get 1.45. So this means my actual RTL is 1.45 ms @ 64 samples/44.1 kHz?
Have I screwed this up? Is the sample rate/ms latency NOT reducible via a common denominator?
Now here’s where you went wrong. The 414 samples (which is equal to 9.39 ms as reported by CEntrance) IS your RTL. There’s no need to go further than this, unless you also want to find your hidden buffer size.
What you did was divide the total reported RTL of 414 samples by the buffer size of 64 samples, when what you should’ve done was divide 414 by the sample rate (44.1 kHz in this case). That’s why you got a lower number: 6.468 isn’t milliseconds at all, it’s your RTL measured in 64-sample buffers, so dividing 9.39 ms by it doesn’t give you anything meaningful. Again, the correct formula to find your RTL is:
(Input Buffer) + (Output Buffer) + (AD/DA latency) + (Hidden Buffers) = (RTL)
For example, if you set your interface to a 64-sample buffer @ 44.1 kHz you would get the following:
(64 samples) + (64 samples) + (~88.2 samples) + (???) = (RTL in samples)
In your case, this was a total of 414 samples. To get how many milliseconds this is equal to, divide your RTL in samples by the sample rate (44.1 kHz is 44,100 samples per second, i.e. 44.1 samples per millisecond):

414 samples / 44.1 samples per ms = 9.39 ms
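If you want to sanity-check the conversion yourself, it’s a one-liner to script. A small Python sketch (the numbers are just the ones from your CEntrance readout, nothing else is assumed):

```python
# Convert a round-trip latency reported in samples to milliseconds.
def samples_to_ms(samples, sample_rate_hz):
    # sample_rate_hz samples span 1000 ms, so each sample lasts 1000/rate ms
    return samples / sample_rate_hz * 1000

print(round(samples_to_ms(414, 44_100), 2))  # 9.39
```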
If you want to find the Hidden Buffers you can use the following formula:
(RTL) - (Input Buffer) - (Output Buffer) - (AD/DA latency) = (Hidden Buffers)
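Plugging in your numbers (and keeping in mind the ~88.2-sample AD/DA figure is only an approximation), that rearranged formula looks like this in Python:

```python
# Rearranged RTL formula: whatever the measured RTL doesn't account for
# must be hidden (driver/USB) buffering.
def hidden_buffer_samples(rtl, input_buf, output_buf, adda):
    return rtl - input_buf - output_buf - adda

# 414 measured, 64 in, 64 out, ~88.2 samples of AD/DA (approximate figure)
print(hidden_buffer_samples(414, 64, 64, 88.2))  # ~197.8 samples
```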
The only way to mathematically calculate your latency at different buffer sizes, without running CEntrance at each buffer setting, is to know the exact figure of your hidden buffers; otherwise you will get lower figures than your real RTL. Is that why you want a common formula (to avoid running CEntrance at each buffer size)? Maybe I’m misunderstanding what you mean by a common denominator.
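To make that concrete: once you’ve derived a hidden-buffer figure from one real measurement, you can predict RTL at other buffer sizes. A hedged Python sketch, assuming the hidden buffer count stays constant across buffer sizes (not every driver behaves that way) and reusing the approximate ~88.2-sample AD/DA figure:

```python
# Predict RTL at a new buffer size, assuming the hidden buffer figure
# (derived once from a real measurement) stays constant.
def predict_rtl_ms(buffer_samples, sample_rate_hz, adda_samples, hidden_samples):
    total = 2 * buffer_samples + adda_samples + hidden_samples  # in + out + AD/DA + hidden
    return total / sample_rate_hz * 1000

HIDDEN = 414 - 64 - 64 - 88.2  # ~197.8 samples, from the 64-sample measurement

print(round(predict_rtl_ms(64, 44_100, 88.2, HIDDEN), 2))   # 9.39 (matches the measurement)
print(round(predict_rtl_ms(128, 44_100, 88.2, HIDDEN), 2))  # 12.29 (predicted)
```

The only honest way to trust the 128-sample prediction is still to run CEntrance at 128 samples once and compare.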