OK, riddle me this: why does engaging the CDC for a mixdown session reduce the ASIO-Guard peak in the Audio Performance meter, along with reducing the output volume? The volume change is very much like tossing a blanket over the monitors. I have 15 plugins loaded across 5 tracks, one plugin on an FX track, and one plugin on the Stereo Out bus.
Doesn’t Constrain Delay Compensation disable any plugins that add latency? Assuming that is the case (pretty sure it is), taking plugins out of the loop would reduce the load on the ASIO-Guard side of things. And, of course, disabling plugins that affect the volume of the track or bus they’re on will also change the volume of those tracks. So, for example, if you had Ozone on a stereo bus (and mastering plugins like Ozone almost always have a fair bit of latency), suddenly you’ve taken that mastering processing out of the loop, and the “make it louder” portion of the effect Ozone was having on the mix is no longer there.
The Constrain Delay Compensation feature is really meant for tracking, so you can track without the latency these plugins introduce.
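To make that concrete, here’s a rough Python sketch of the behavior as I understand it (the plugin names and latency figures are made up for illustration, and this is obviously not how Cubase implements it internally):

```python
# Illustrative sketch of what CDC seems to do: each plugin reports its
# processing latency in samples, and CDC bypasses any plugin that adds latency.

plugins = [
    ("Channel EQ", 0),             # zero latency: stays active
    ("Compressor", 64),            # small lookahead buffer: bypassed
    ("Ozone (mastering)", 16384),  # large lookahead: bypassed
]

for name, latency_samples in plugins:
    state = "active" if latency_samples == 0 else "bypassed by CDC"
    print(f"{name}: {latency_samples} samples -> {state}")
```

Bypassing the high-latency plugins is what takes the load off ASIO-Guard, and any gain those plugins were adding disappears along with them.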
Rick, you could be correct here. To be honest, I have never used this feature, as my PC is pretty fast, along with my interface. But for some reason I thought this feature was partly responsible for time alignment while tracking - and here I mean something different from turning off plugins on the tracks being freshly recorded: something that kept the newly recorded track in alignment with the pre-recorded tracks being played along to.
Well, surprisingly, the session went well with no timing issues - and this must be because of the low latency of my interface and not anything related to the CDC feature?
You could have the fastest PC in the world, but if you are tracking through a plugin with really long latency, the effective latency could make tracking difficult. For example, some mastering-type plugins might have latencies of 300 ms, nearly a third of a second, or even more in extreme cases (I think there is one I use occasionally that was upward of a second in some configurations).
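To put numbers on it, plugin latency is reported in samples, and converting to milliseconds is simple arithmetic (the figures below are just examples):

```python
def latency_ms(samples: int, sample_rate: int) -> float:
    """Convert a plugin's reported latency from samples to milliseconds."""
    return samples / sample_rate * 1000.0

# ~13,230 samples at 44.1 kHz is the 300 ms case mentioned above:
print(latency_ms(13230, 44100))  # 300.0
# and 44,100 samples at 44.1 kHz would be a full second of delay:
print(latency_ms(44100, 44100))  # 1000.0
```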
I mostly don’t track through plugins, and I disable any long-latency plugins that might be in my project while tracking. That situation is actually pretty unusual anyway, since I separate mixing from recording in most cases, or sometimes render a submix of tracks to track against, since my CPU is not all that fast (it was roughly at the knee of the price/performance curve back in late 2014). The key exception is if I’m tracking a virtual electric guitar and want to have it going through an amp (and pedals, etc.) simulator, where the sound of the signal chain may greatly affect how I play the parts I’m tracking.
There is a setting you can enable in the MixConsole that will tell you how much plugin latency is in the plugin chain for any given track (this is separate from the audio interface latency, and adds to it when tracking). I believe the CDC control disables any plugins that have non-zero latency, and you should be able to tell from the status of such plugins, when using the control, which ones have been disabled. (Note, though, that I’ve only used that control a very small number of times, so I may be remembering the visual appearance incorrectly.)
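And for the “adds to the audio interface latency” part, the back-of-the-envelope math looks like this (all numbers are hypothetical; substitute your own buffer size and whatever the MixConsole reports for the plugin chain):

```python
SAMPLE_RATE = 44100  # Hz

# Interface round trip: input buffer plus output buffer, e.g. 128 samples each way.
interface_rt_ms = (128 + 128) / SAMPLE_RATE * 1000  # ~5.8 ms

# Plugin-chain latency as reported in the MixConsole, converted to ms.
plugin_chain_samples = 2048
plugin_ms = plugin_chain_samples / SAMPLE_RATE * 1000  # ~46.4 ms

# What you actually hear while tracking through that chain:
print(f"effective monitoring latency: {interface_rt_ms + plugin_ms:.1f} ms")  # ~52.2 ms
```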
As for aligning new tracks with previously recorded tracks, in most cases Cubase should take care of this automatically, based on its knowledge of your audio interface latency. However, I’m pretty sure there is also a preference setting that can be used to adjust that automatic compensation if it isn’t getting you the results you want. For example, if you tend to play slightly behind what you are hearing due to response time, it could make sense to have Cubase slide the recorded tracks a bit earlier to compensate. (I think I have a tendency to push the beat in my playing, but I adjust afterward if needed, and it’s actually fairly rare for me to need to do that.)
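If you wanted to work out what such an adjustment amounts to, it’s just a time shift. A quick sketch (the function name is mine, not the actual Cubase preference):

```python
SAMPLE_RATE = 44100  # Hz

def record_offset_samples(offset_ms: float) -> int:
    """Convert a record-offset adjustment in ms to samples (negative = earlier)."""
    return round(offset_ms / 1000.0 * SAMPLE_RATE)

# A player who consistently lands ~10 ms behind what they hear could have
# their recorded events slid 10 ms earlier:
print(record_offset_samples(-10.0))  # -441
```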
Rick, thanks again for your input. I use iZotope plugins, too, and, as you say, they do have built-in latency - something I don’t think about. Until I do.
Looking back on my post here, I realize that a couple of things happened that created my initial question about the CDC process. For my project I had first recorded a click track, then a bass track, and then a rhythm guitar track. When I brought the harp player/vocalist in, I turned on the CDC - something I never do. It was more of an experiment, actually, to see if it gave a better timing result for the harp player’s rhythm part. Everything sounded fine, so I left the CDC on and recorded his vocal. Then I came back and had him play his solo harp parts. After that I did a quick mixdown, using some compression on the bass track, the guitar track, his harp playing, and his vocal, adding some verb too, and finally I put Ozone 11 on the stereo bus. A basic quick mix to send him home with something to listen to.
All the while I didn’t think much about the fact that I had left the CDC on. When I came back to the project the next day, I spent a few hours editing the harp solo and the vocal - forgetting the CDC was still on. So, maybe 12 plugins later, I have an acceptable rough mix.
At this point I finally notice that the CPU load has kicked up to slightly above 25%, something I have typically never seen. Hm. So I’m looking at the project window and I notice that the CDC button is on. As I turn it off, I see the CPU line drop to 15% and hear the project volume drop. And then I posted my question. Obviously your suggestions were on the money.
Let me close this post by saying that your comments are now part of my education regarding use of the CDC process, Rick. Thank you for taking the time to set me straight on this.
Interesting. This is actually counter-intuitive, in that I’d think the CPU usage would increase when turning CDC off (since that would re-enable plugins that had been disabled due to latency considerations, and often such plugins will be on the heavy side CPU usage-wise).
In any case, I’m glad you now have things resolved.
I understand what you are saying, Rick, and somewhat agree with your last point. However, my conclusion here is that, whatever way the user might choose to use CDC, it was designed for tracking, not mixdown. Running latency-heavy plugins at mixdown, and upping the interface sample rate, is the norm here. So, with that said, it makes sense to me that if you did use CDC at mixdown by mistake (like I did), and at a low interface sample rate as well, then the CPU number would go up as the processor tried to make the process work. And that is what I experienced. I been schooled.
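Edit: for anyone reading this later, here is the sample-rate point in numbers. A plugin whose lookahead is a fixed number of samples (a hypothetical figure below) amounts to less real time at a higher sample rate, even though the CPU has more samples per second to chew through:

```python
LOOKAHEAD_SAMPLES = 8192  # hypothetical fixed lookahead of a mastering plugin

for rate_hz in (44100, 96000):
    print(f"{rate_hz} Hz: {LOOKAHEAD_SAMPLES / rate_hz * 1000:.1f} ms")
# 44100 Hz: 185.8 ms
# 96000 Hz: 85.3 ms
```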