Here is a (possibly apocryphal) explanation of how it can happen that your CPU load is only at 40% while your Cubase Performance meter is topping out in the red.
I saw this video when it was uploaded. Really very informative.
Recommended for everyone…
Thank you Steve for posting this.
Thank you for sharing!
I tested the DPC latency and kept it below 500 at all times.
However, Wdf01000.sys (0.5 ms) was reported as exceptionally high. Is this normal?
Is there a problem with the performance of my audio interface?
WDF = Windows Driver Frameworks
First: connect your audio interface to another port (e.g. from a USB 3.0 port to a USB 2.0 port), and use only high-quality USB cables.
Second: update all drivers on the system (graphics card, network, USB, audio). You should not see any warnings in the Device Manager (usually shown as a yellow warning icon).
What it really comes down to is that DAWs require a minimum number of kB per millisecond, not a maximum number of MB per second.
However, compared to a lot of other applications, the maximum throughput rates required for audio aren’t all that great. They are relatively low quantity, but with finicky timing.
CPUs are generally specified to impress people whose tasks have throughput rates that are measured over seconds, not milliseconds.
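To make the kB-per-millisecond point concrete, here is a rough arithmetic sketch. The sample rate, buffer size, and track count are illustrative assumptions, not figures from this thread:

```python
# Rough sketch of why audio is deadline-bound, not throughput-bound.
# All figures below are illustrative assumptions.

SAMPLE_RATE = 48_000      # samples per second
BUFFER_SIZE = 128         # samples per ASIO buffer
TRACKS = 32               # mono audio tracks streamed from disk
BYTES_PER_SAMPLE = 3      # 24-bit audio

# Hard deadline: each buffer must be ready within this window,
# or you hear a click.
deadline_ms = BUFFER_SIZE / SAMPLE_RATE * 1000   # ~2.67 ms

# Data needed per deadline window across all tracks.
bytes_per_buffer = BUFFER_SIZE * BYTES_PER_SAMPLE * TRACKS
kb_per_ms = bytes_per_buffer / 1024 / deadline_ms

# The same stream expressed as average throughput: trivially
# small for any modern drive.
mb_per_s = SAMPLE_RATE * BYTES_PER_SAMPLE * TRACKS / 1e6

print(f"deadline per buffer: {deadline_ms:.2f} ms")
print(f"required: {kb_per_ms:.1f} kB per ms, "
      f"i.e. only {mb_per_s:.1f} MB/s on average")
```

The average throughput is laughably small; the catch is that those few kB must arrive every couple of milliseconds, without fail.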
This is also why maximum drive transfer rates are not a good indicator for DAW use. SSDs basically solved drive-based DAW issues. However, beware of disk speedup tools, like Samsung's Magician software, which in theory can push latency up to 30 milliseconds, atrocious for DAW use.
The basic problem with drive specs is that they rely upon saturated drive queues for maximum throughput, but that means that everything is processed with maximum latency. DAWs work better if drive queues are mainly empty, and never saturate.
It's all based on queueing theory, so imagine waiting in a bank queue. The bank would prefer 100% teller utilisation, so it would prefer that there is always a queue, achieving maximum throughput when the queues are always full. You, as a customer, would prefer not to wait, but to be served as soon as you [randomly] walked in the door. You are like the DAW; the bank is like the drive manufacturer. Your performance requirements are different, so don't treat the bank's/drive's specs as suitable for your requirements.
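The bank analogy can be sketched as a toy single-server queue. The utilization values are arbitrary illustrative assumptions; the point is how waiting time explodes as the server (teller, or drive) approaches full utilization:

```python
import random

# Toy M/M/1 queue (one "teller", service rate 1): as utilization
# rises toward 100%, throughput is maximized but the average wait
# in the queue blows up. Parameters are illustrative assumptions.

def mean_wait(utilization, n=200_000, seed=1):
    """Average time a customer waits before service begins."""
    rng = random.Random(seed)
    arrival = 0.0
    server_free = 0.0
    total_wait = 0.0
    for _ in range(n):
        arrival += rng.expovariate(utilization)      # next arrival
        start = max(arrival, server_free)            # wait if busy
        total_wait += start - arrival
        server_free = start + rng.expovariate(1.0)   # service time
    return total_wait / n

for rho in (0.3, 0.7, 0.95):
    print(f"utilization {rho:.0%}: mean wait ~ {mean_wait(rho):.2f}")
```

This matches the textbook M/M/1 result (mean queue wait of rho/(1-rho) for unit service rate): near-idle servers answer almost immediately; a nearly saturated one keeps everyone waiting, which is exactly what a DAW cannot afford.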
Very well articulated, my friend.
You’re welcome. Thank you.
OK, I’ve got a twist on this that doesn’t make sense to me.
I’ve got a song that is 4:48 long and plays back just fine - no clicks, VST performance monitor mid-range, no real-time issues at all. I started exporting the audio and it seemed kind of slow. So I exported it again and timed it. Turns out it took 6:08 to export a 4:48 song. Why would that be?
It might be as simple as that it doesn't need to rush and strain system resources like it does when it's 'on the clock'.
In particular, it might be in a non-blocking mode, where it is not outright prioritising resources for playback, but allowing them to be used for other purposes.
Some plugins may behave differently in realtime vs offline mode, on the assumption that offline mode is for the important high-quality rendering of end-product output, while realtime mode is optimised for on-demand delivery during tracking.
On non-busy tracks, exporting can take less time, because it is not constrained by timing to space it out.
I'm 99% certain it's the above. All plugins get a signal that they are rendering instead of playing in real time. Some switch their algorithm from good to perfect, or their oversampling from 2x to 16x, etc. Some plugins have options for this in their settings.
Would be interesting to know which one is causing it in this case.
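The realtime/offline switch described above can be sketched as follows. This is not a real plugin API (in VST 3 the host passes a process mode to the plugin, but the class and method names here are illustrative assumptions only):

```python
# Sketch of a plugin that trades CPU for quality when the host
# signals offline rendering. Names are illustrative, not a real API.

class OversamplingEQ:
    def __init__(self):
        self.oversampling = 2          # cheap default for tracking

    def set_process_mode(self, offline: bool):
        # Host signals the render mode; switch 2x <-> 16x.
        self.oversampling = 16 if offline else 2

    def process(self, block):
        # Higher oversampling = proportionally more work per block.
        cost = len(block) * self.oversampling
        return cost  # stand-in for the actual DSP work performed

eq = OversamplingEQ()
realtime_cost = eq.process([0.0] * 128)
eq.set_process_mode(offline=True)
offline_cost = eq.process([0.0] * 128)
print(f"offline does {offline_cost / realtime_cost:.0f}x the work")
```

Which is why an export can take longer than the song itself: the same plugin chain is simply doing several times more work per block.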
I have seen extremely speedy or slow exports too, which are not always related to the CPU/ASIO load.
Yeah, that makes sense.
This video tells only half of the story.
Cubase has an inefficient audio engine. So when you hear pops and crackles, it is not necessarily because the real-time performance of your computer is bad, but because Cubase is not able to exploit it sufficiently. Reaper demonstrates in comparison how efficient an audio engine can be (but Cubase has other areas where it outperforms Reaper).
When I export a track, my CPU is never higher than 15-20%. I do not have many audio tracks but a lot of FX and VIs, so we are not talking about SSD performance. My RAM is fast. The explanation in the video does not cover this in any way. According to the logic described in the video, and given you do not have a non-realtime bottleneck, exporting should always max out the CPU at nearly 100% (as in the video rendering example). But that is not the case. See below.
The actual problem is often not the real-time performance of the computer but the latency introduced by the plug-ins in the longest signal path within the DAW. If channel and bus latencies add up to 800 ms, it is not possible to render anything in real time.

This is also why "DAW performance shoot-outs" that put the same FX on parallel tracks again and again are silly. That only shows how many channel processes can be calculated in parallel. In contrast, you will quickly kill every DAW with a complex FX chain on one track plus a complex FX chain on the bus: your VST meter will max out even though the DAW could handle numerous other tracks in parallel.

This is also relevant in offline processing: the whole system can only move on to the next chunk of audio once the last chunk has been calculated completely (at least in Cubase). So if every chunk has to pass through a complex master FX chain that cannot be scheduled across several parallel threads but must be calculated sequentially (one plug-in after another), the rest of the CPU cores sit waiting for that one process to finish. In these situations, with a complex master FX chain, more cores are useless. You would need one core (the one running the master FX chain, which cannot be parallelised) at 5 GHz while the others could run at 2 GHz.
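The serial-bottleneck argument above is essentially Amdahl's law. Here is a sketch with made-up per-chunk costs (all numbers are illustrative assumptions): channel FX parallelize across cores, the master chain does not.

```python
# Sketch of the serial master-chain bottleneck, in the spirit of
# Amdahl's law. Per-chunk costs below are illustrative assumptions.

def render_time(chunks, channel_ms, master_ms, cores):
    """Total render time when channel FX spread across cores but
    the master FX chain runs sequentially on a single core."""
    parallel_part = channel_ms / cores   # channel FX, split up
    serial_part = master_ms              # master chain, one core
    return chunks * (parallel_part + serial_part)

chunks = 1000                      # chunks of audio to render
channel_ms, master_ms = 8.0, 4.0   # per-chunk processing cost

for cores in (1, 2, 4, 16):
    t = render_time(chunks, channel_ms, master_ms, cores)
    print(f"{cores:>2} cores: {t / 1000:.1f} s")
```

The speedup flattens quickly: going from 1 to 2 cores helps a lot, but beyond a few cores the serial master chain dominates and extra cores barely move the total, which is the "one core at 5 GHz" point above.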
But that’s exactly what the video says… he talks about DAWs too.
I’m sure Richard Ames had good intentions when he made this video, but it is complete and utter nonsense. He knows a lot about Windows computers, but nothing about VST. He noticed a few things about his particular setup, which is rather unusual, drew some incorrect conclusions, then made things worse by overgeneralizing to assume this is relevant to other DAW users.
He completely misunderstands the reason buffer size affects performance, does not understand the transient nature of cpu overloads, apparently is ignorant that his plugins run differently when rendering for export, is apparently completely unaware of the importance of streaming performance in his particular setup, and is dead wrong when he says cpu speed is a minor factor in DAW performance for most people.
- His reasons for the buffer size affecting performance are wrong. The real reason is that the fixed overhead of processing each buffer is amortized over fewer samples when the buffer is smaller, so total overhead per second rises. His “window of opportunity” idea is pretty funny though.
- His overloads are transient. They occur less than 100% of the time. That means load values displayed in his meter, which are sampled over a period of time, will be less than 100% even when cpu overloads occur.
- The plugins in his test run in a completely different manner when rendering for export due to different streaming requirements in that mode. So he can’t use that to compare the cpu load in those two different scenarios.
- I don’t know where to begin explaining his incorrect reasoning in assuming cpu overloads generally experienced by users are due to reasons other than cpu capacity. Sure, there are unusual cases where that’s true, but that’s, well, unusual.
- He seems to completely miss the point that the scheduling of multiple cores makes it difficult if not impossible to draw the inference he draws about whether he has sufficient cpu capacity. All it takes is one thread on one core missing a time slice schedule to get an overload, even though that core may be under-utilized over a span of several hundred msec.
- His swapping out the boot drive example makes no possible sense. There should be no I/O at all to that volume during his test. He should have realized that and investigated why he got unexplainable results, instead of presenting it as evidence that his cpu capacity was not significant in handling the processing load. I suspect there was something else happening that he overlooked…perhaps he’s encountering pathological paging for some reason. Whatever it was, it was almost certainly something specific to the use of VEPro.
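The transient-overload point above can be illustrated numerically: a meter that averages load over a long window under-reports brief per-buffer deadline misses. The buffer deadline and workload pattern below are illustrative assumptions:

```python
# Sketch: one brief per-buffer deadline miss (an audible click)
# barely registers in a load meter averaged over ~1 second.
# Numbers are illustrative assumptions.

buffer_ms = 2.9   # deadline per buffer (e.g. 128 samples @ 44.1 kHz)

# Simulated per-buffer processing times over ~1 second:
# mostly light work, with a single 4 ms spike.
work_ms = [1.0] * 340 + [4.0] + [1.0] * 4

overloads = sum(w > buffer_ms for w in work_ms)       # deadline misses
avg_load = sum(work_ms) / (len(work_ms) * buffer_ms)  # what the meter shows

print(f"deadline misses (audible clicks): {overloads}")
print(f"averaged load shown by meter: {avg_load:.0%}")
```

One buffer missed its deadline, so you hear a click, yet the averaged meter sits around a third of full load. That is why "the meter never hit 100%" does not mean the CPU was never the problem.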
I think the bottom line is: he is providing an explanation for a rather unusual experience, one where a machine with a more powerful CPU doesn’t outperform a less powerful one when using VEPro. Somewhere during the video, he turned that into trying to explain why, in general, the performance meter is maxed but the CPU is not. Those are two very different scenarios. His explanations for the former do not, in general, explain the latter.
Thanks for that- there’s a bit to unpack… I’ll be back
Like I mentioned, I think he made the video with good intentions. But he seems completely unaware that VEPro has performance characteristics very different from virtually every other plugin, so any conclusions he comes to will not be relevant for anyone not using VEPro. And, to make matters worse, the conclusions he comes to are rather odd and not supported by the facts anyway. There are answers to your question of “Why is the performance meter maxed, but the CPU not?”, but they are not to be found in that video.
How come in other DAWs the ASIO performance meter shows the same as the CPU meter?
The comments in the video about the system CPU meter are misleading. There is very little relationship between that meter and the amount of CPU in use for processing audio in your DAW. In any DAW, if those two are showing the same value, that is just a coincidence.