If I put Waves Clarity Vx Pro into a batch processing chain, it limits processing to 1 or 2 cores and takes forever. Plugins from other brands use all cores and work fine.
Is this a bug, or are Waves plugins limiting processor usage?
Unless you select the “one task” option in the batch processor, WaveLab allocates multiple plugin instances. If this plugin consumes a lot of memory and CPU, you might run into an overload.
But again, if you select “one task”, there is no reason you can’t use Clarity in the batch processor.
I have a 64-core Threadripper, and with Clarity Vx Pro it uses only 1 or 2 cores. If I use, for example, iZotope RX, it uses all cores. So I think something is wrong either in Clarity or in WaveLab.
Since I have Clarity, I could test it with my 32-core Threadripper. I have found the problem to be with Clarity, but at the same time, I have found a workaround.
What I have found is that it works correctly, using all the cores (as many as there are files to process), except that the first file is not properly processed.
What I do:
- I set the batch processor to use multiple cores.
- I put the Clarity plugin in the batch processor.
- I put multiple files to process.
- I start the run. If you observe the WaveLab task window, you can see the normal progress of multiple files in parallel. But you can also see that the first file (and only that one) was processed almost instantly, as if the plugin had been bypassed.
The workaround is this: put a Clarity instance in the Master Section. Its mere presence solves the problem. It is as if Clarity’s first instance needs to be activated, and then a delay needs to be observed. This does not happen with other plugins.
I use it in a watch folder, so this workaround does not apply there, does it?
I have done some more experiments. From a programming point of view, this Clarity problem has all the hallmarks of what is called a “race condition”. In non-technical terms: the plugin thinks it is ready, but something it depends on is not. Apparently (?), there is inter-process communication between the plugin and some other Waves software component involved.
If WaveLab adds a delay of 100 milliseconds before starting the processing, the problem disappears and Clarity processes properly.
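In code, that kind of race can be sketched in a few lines of Python. This is a toy model only; all names are hypothetical and nothing here is the real Waves or WaveLab code. The first instance starts a shared engine in the background and reports “ready” immediately, so the very first render is silently bypassed, while a short delay lets later renders succeed:

```python
# Toy sketch (hypothetical names): a plugin whose shared engine starts
# asynchronously, so a render issued too early is silently bypassed.
import threading
import time

_engine_ready = threading.Event()

def _start_shared_engine():
    time.sleep(0.05)              # the engine needs ~50 ms to come up
    _engine_ready.set()

# First instance kicks off the engine but does not wait for it.
threading.Thread(target=_start_shared_engine).start()

def process(samples):
    if not _engine_ready.is_set():
        return list(samples)      # not ready yet: behaves like a bypass
    return [s * 0.5 for s in samples]

first = process([1.0, 1.0])       # bypassed: engine is still starting
time.sleep(0.1)                   # a 100 ms delay hides the race
second = process([1.0, 1.0])      # processed normally
print(first, second)              # [1.0, 1.0] [0.5, 0.5]
```

The first call returns its input untouched (the “bypassed first file” symptom), while the second, issued after the delay, is processed.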
For the next WaveLab 11 minor update, I have added this imperceptible delay (100 ms before any offline rendering) to all plugins manufactured by Waves (because Clarity is apparently not the only Waves plugin using its underlying multicore technology).
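Host-side, such a per-vendor delay could be sketched like this (illustrative Python only; the vendor set and function names are invented and are not WaveLab’s actual code):

```python
# Hypothetical host-side sketch: before any offline render, pause
# briefly for plugins from a vendor known to need settling time.
import time

SETTLE_DELAY_S = 0.1                 # the 100 ms mentioned above
VENDORS_NEEDING_DELAY = {"Waves"}    # invented list, for illustration

def offline_render(plugin_vendor, process, samples):
    if plugin_vendor in VENDORS_NEEDING_DELAY:
        time.sleep(SETTLE_DELAY_S)   # give the plugin's engine time to settle
    return process(samples)

print(offline_render("Waves", lambda s: [x * 0.5 for x in s], [1.0, 2.0]))  # [0.5, 1.0]
```

Plugins from other vendors are rendered immediately, so the delay costs nothing where it is not needed.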
But I have also experimented with what you reported at the beginning of this thread: the more files you process with Clarity in parallel tasks, the worse the performance becomes.
In other words, it is faster to process the files one by one (use the One Core setting in WaveLab’s batch processor) than to process multiple files through multiple Clarity instances.
It is as if all Clarity instances were concurrently accessing a single resource for their computation (only one Clarity instance can process at a time), which adds overhead.
Of course, this is a Clarity limitation (or bug?), because WaveLab’s batch processor, on the contrary, is fully capable of processing files in parallel and using as many cores as possible (if that setting is enabled in WaveLab). This can easily be verified with other plugins.
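The contention pattern described above can be demonstrated with a toy Python model (nothing to do with the actual Clarity internals): when every worker must hold one shared lock while it computes, running four workers “in parallel” takes as long as running them one by one, while lock-free workers genuinely overlap.

```python
# Toy sketch: four workers serialized by one shared lock vs. four
# workers with no shared resource.
import threading
import time

shared_engine_lock = threading.Lock()

def render_locked():
    # All "instances" funnel through the same single resource.
    with shared_engine_lock:
        time.sleep(0.05)          # stands in for the actual DSP work

def render_free():
    time.sleep(0.05)              # no shared resource: can truly overlap

def timed(target, n=4):
    threads = [threading.Thread(target=target) for _ in range(n)]
    t0 = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - t0

locked = timed(render_locked)     # ~0.20 s: fully serialized
free = timed(render_free)         # ~0.05 s: truly parallel
print(f"locked={locked:.2f}s free={free:.2f}s")
```

The locked variant takes roughly four times as long: parallelism buys nothing when a single resource serializes all the work, and the thread management only adds overhead.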
Would that 100ms delay also “fix” the issues from the other thread with Acoustica plugins?
I don’t think so.
I have discussed this with Waves and did further experiments this afternoon.
- Update to the latest Clarity, as I did, because with it I can’t reproduce the “need a delay” problem.
- The Clarity engine is not optimized for AMD Ryzen CPUs. Intel CPUs work better, though not extraordinarily so.
- The Apple M1 Max shines, works 100%, and does what we should expect. I know this does not solve your problem, but I wanted to share this info.
I downloaded the newest version of the plugin and even ran the MKL optimization program. No difference; it still uses only 2 cores. Did they say they are doing something to correct this?
Yes and no. This plugin depends on a third-party module, and the problem comes from there, out of their control. But I guess this module will be updated in time.
Ok, thanks for the answer.