RipX DAW Pro vs SpectraLayers speed test

Downloaded the demo for RipX DAW Pro. Loaded a 3-minute song into the plugin version via ARA inside Cubase 12 Pro. It took over 2 minutes to unmix.
Did the same test with SpectraLayers Pro. It took 8 seconds at best quality.
As far as I could tell from the quick test, the quality was equal.
With RipX I also had to download the CUDA Toolkit to enable the GPU.
Tried the standalone version and it took the same length of time. So for me, SpectraLayers was about 15 times faster. Eddie


Also, RipX requires a more up-to-date GPU; they say an RTX 3070 as a minimum.


My test was done with an Nvidia 3080 Ti with 10 GB of onboard RAM. They can be found used now for around £500.

In SL10 Help|About, it says the ONNX Runtime is used, which takes advantage of many more GPU types, and is not limited to Nvidia products. It can even use the integrated graphics of Intel CPUs.

RipX on Windows, on the other hand, is currently only able to use Nvidia CUDA. That’s going to add a lot to the price for anyone who doesn’t already have an Nvidia GPU, making SpectraLayers much more attractive in price.

Does this mean that even with CUDA, RipX was taking 2 minutes? In that case, SpectraLayers is blowing it out of the water and requires less hardware!

The requirements for RipX suggest “NVIDIA GeForce 1070/1080/1080Ti/2070/2080/2080Ti/3070/3080/3090/40XX with 16 GB RAM”. Good to know it will work with 10 GB, but I wonder whether that affects the performance under RipX?

As far as I know, RipX was using CUDA. I installed the CUDA Toolkit as instructed. A friend tried the same test on his dinosaur PC and it took 15 minutes.


Also FYI, SpectraLayers only requires a GPU with 4 GB RAM 🙂


It also works with the GTX 1080 8 GB in my system. I have been using RipX for over two years and SL since version 4. Both work well, and I also have a Vega 64 in the same system. Hopefully the next version of SpectraLayers could use multiple GPUs.


Time to process is all relative to the settings used, all things otherwise equal. I believe both RipX and SpectraLayers are using the same Demucs network, but they’re probably not using the same prediction shifts and overlap settings, or even the same model size. Each prediction shift increase adds time to process, and much above 0.8 overlap the settings become exponentially time-consuming. E.g. I can run the same song with 1 shift and 0.25 overlap and it’ll take 10 seconds. Change it to 20 shifts and 0.99 overlap and it’ll take all day. That doesn’t necessarily mean I will notice much difference in the result, but I would imagine RipX has chosen to use higher settings than SpectraLayers, that’s all.
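To make the scaling concrete, here’s a rough back-of-the-envelope sketch (my own illustration, not RipX’s or SpectraLayers’ actual code): Demucs-style separation repeats a full pass over the track once per shift, and the segment hop shrinks as overlap approaches 1, so the work grows roughly as shifts / (1 − overlap).

```python
# Rough cost model for Demucs-style separation settings (illustrative only):
# each shift is one full pass over the track, and the number of overlapping
# segments grows like 1 / (1 - overlap), so cost ~ shifts / (1 - overlap).

def relative_cost(shifts: int, overlap: float) -> float:
    """Relative processing cost for the given shifts/overlap settings."""
    if not 0 <= overlap < 1:
        raise ValueError("overlap must be in [0, 1)")
    return shifts / (1.0 - overlap)

light = relative_cost(shifts=1, overlap=0.25)   # fast settings
heavy = relative_cost(shifts=20, overlap=0.99)  # slow settings

print(f"heavy settings cost ~{heavy / light:.0f}x the light ones")
```

Under this model the 20-shift / 0.99-overlap run costs about 1500 times the 1-shift / 0.25-overlap run, which lines up with a 10-second job stretching into hours.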

RipX on Windows uses CUDA if you’ve got CUDA, ONNX if you don’t. I’ve never really understood why RipX needs the CUDA Toolkit: it’s just running the htdemucs and htdemucs_6s models, same as e.g. UVR, which doesn’t need the toolkit to run on CUDA. My understanding is that RipX is more of a macOS development ported to Windows, so they’re probably lacking some tools/knowledge there, or perhaps they need a particular feature that only the toolkit allows?
