How to Activate GPU Acceleration in Cubase for SpectraLayers 11 Used as an ARA Extension?

Hey everyone,

I’m using SpectraLayers 11, and I’ve had great success with GPU acceleration when running it as a standalone application. In the preferences under Miscellaneous, I selected my GPU, and stem separation is significantly faster, which is exactly what I wanted.

However, I’m having a problem when I try to use SpectraLayers 11 as an ARA extension in Cubase. While SpectraLayers itself runs fine as an ARA extension, I think the GPU acceleration isn’t working in this mode.

Here’s what’s happening:

  1. The processing time in Cubase is much slower compared to when I use SpectraLayers as a standalone application.
  2. When I monitor my system resources with a GPU profiler, it shows that the GPU isn’t being used at all during processing in Cubase (see the quick check sketched after this list).
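
In case anyone wants to reproduce point 2, here is a minimal sketch of the kind of utilization check I mean, assuming Python with the nvidia-ml-py (pynvml) package installed; any GPU monitor (Task Manager, a profiler, nvidia-smi) works just as well:

```python
# Poll NVIDIA GPU utilization once per second while stem separation runs in Cubase.
# Sketch only: assumes the nvidia-ml-py (pynvml) package and an NVIDIA driver.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first NVIDIA GPU (the RTX 4090 here)
print("Watching:", pynvml.nvmlDeviceGetName(handle))

for _ in range(30):  # roughly 30 seconds of samples
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)
    print(f"GPU {util.gpu}%  memory {util.memory}%")
    time.sleep(1)

pynvml.nvmlShutdown()
```

This is how the difference shows up for me: the GPU gets used when SpectraLayers runs standalone, but the utilization stays at zero during processing in Cubase.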

Even though I’ve activated GPU acceleration in SpectraLayers’ preferences, it doesn’t carry over when used as an ARA extension in Cubase.

I couldn’t find a solution. Does anyone know how to solve this?

More details
I’m using an RTX 4090 graphics card, and something very odd is happening. When I run SpectraLayers as a standalone application and select my RTX 4090 as the main GPU, everything works perfectly, especially during stem separation. However, when I use SpectraLayers as an ARA plugin in Cubase, Cubase defaults to using my integrated graphics card (the one built into my CPU) instead of the RTX 4090.

What’s even stranger is that if I switch SpectraLayers to CPU mode, then Cubase also uses the CPU for processing when SpectraLayers is used as an ARA extension. So, when I set SpectraLayers to use my main GPU, Cubase somehow ends up using the integrated graphics card instead.

Now, here’s the strangest part: I found a workaround. If I select my integrated GPU in the SpectraLayers settings (so that SpectraLayers uses the integrated card in standalone mode), Cubase suddenly starts using my RTX 4090 as the main GPU when SpectraLayers is used as an ARA plugin. It’s the weirdest thing I’ve seen, but at least it works. I have no idea why this behavior occurs, but I’m glad there’s a way to make it work.
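
Purely as an illustration of what might be going on (this is a sketch in Python on Windows calling PowerShell, and the idea that the saved selection is resolved against a different adapter order inside the ARA host is my own guess, not anything confirmed):

```python
# List the display adapters Windows reports and the order they come back in.
# Sketch only; the enumeration SpectraLayers/Cubase use internally may differ.
import subprocess

cmd = [
    "powershell", "-NoProfile", "-Command",
    "Get-CimInstance Win32_VideoController | Select-Object -ExpandProperty Name",
]
output = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

for index, name in enumerate(line for line in output.splitlines() if line.strip()):
    print(index, name.strip())

# On a machine with an integrated GPU plus the RTX 4090, two adapters show up.
# If the ARA host maps the saved selection onto a different adapter order than
# the standalone app does, picking the "wrong" card standalone could land on
# the right one in Cubase, which would match the workaround above.
```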

Do you have the option of deactivating the integrated graphics card in the UEFI or in Device Manager?
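
For reference, the Device Manager route can also be done from an elevated prompt. A rough sketch (Python on Windows calling PowerShell; the instance ID placeholder has to be replaced with the integrated adapter’s real ID, and disabling the wrong adapter can blank the screen):

```python
# List display adapters, then disable the integrated one by its instance ID.
# Sketch only; run from an elevated (administrator) prompt.
import subprocess

def ps(command: str) -> str:
    """Run a PowerShell command and return its text output."""
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", command],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

# Show adapter names, instance IDs and status so the iGPU can be identified.
print(ps("Get-PnpDevice -Class Display | Format-Table FriendlyName, InstanceId, Status"))

# Then disable the integrated adapter (placeholder ID, copy the real one from above):
# ps('Disable-PnpDevice -InstanceId "<IGPU_INSTANCE_ID>" -Confirm:$false')
```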

Hi ASM,

Thanks for the suggestion. I tried disabling the integrated graphics card. After doing so, the integrated card was indeed disabled, but a new issue appeared. In SpectraLayers, under the preferences for AI Processing Device, I previously had three options: CPU, integrated card, and RTX 4090. Now, with the integrated card disabled, there’s a new option instead called “Microsoft Basic Render Driver”.

Assuming that the integrated card would no longer appear in the list, I switched the setting to the RTX 4090. However, when I went back to Cubase, the same issue persisted. Cubase now seems to have chosen the “Microsoft Basic Render Driver” instead, because when I open SpectraLayers as an ARA extension I get an error message saying “OpenGL 3.3 not supported. Update your graphic driver or upgrade your graphic card.”
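
To see what a plain OpenGL context actually reports on this machine, something like the following can be used (just a sketch, assuming Python with the glfw and PyOpenGL packages; it runs outside Cubase, so the ARA instance could still end up on a different adapter):

```python
# Create a hidden OpenGL context and print which adapter/driver it landed on
# and which OpenGL version it supports (SpectraLayers' error asks for 3.3+).
import glfw
from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

glfw.window_hint(glfw.VISIBLE, glfw.FALSE)       # no window on screen
window = glfw.create_window(64, 64, "gl-probe", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("Could not create an OpenGL context")

glfw.make_context_current(window)
print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Renderer:", glGetString(GL_RENDERER).decode())
print("Version: ", glGetString(GL_VERSION).decode())
glfw.terminate()
```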

I am sure this error happens because of the “Microsoft Basic Render Driver” option in SpectraLayers. My intuition and conclusion are that Cubase is picking up the graphics card that is not selected in SpectraLayers, rather than the one I set in the preferences. It’s very strange behavior.

The latest patch for SL11 will not look for any OpenGL driver, but may I suggest you move your post to the SpectraLayers forum (click the image and choose SpectraLayers), where Robin, the developer of SL, will be best able to answer?


SpectraLayers 11.0.20 Maintenance Update

Maybe this will improve something…