Why are "Offline Processing" and ARA still so unpredictable?

I am on Nuendo 14.0.40, and after using it for a year (coming from Cubase 13) I just can't see any sign that Steinberg is trying to improve "Offline Processing", let alone optimizing it and making it more stable.

After years, I am still encountering weirdness: crashes out of nowhere (just from adjusting a point in the pitch-shifting curve), changes that don't get applied, or audio clips that simply "disappear" as soon as I apply an effect (that last one is actually new to me).

So what is happening? Why is this not getting better?

Another thing: Spectralayers via ARA. It sounds good on paper and sometimes works for small things, but if you start using it on a clip you've already edited into chunks, it just glitches out. Yesterday Nuendo played back sections of audio where there shouldn't have been any audio: I had cut away some audio from the clip I had also opened with Spectralayers, and it was still being played. It's a total clusterf*ck.

I am very disappointed by this and will try to avoid using both features for the time being.

Here is the "disappearing" bug happening in real time:


I agree, ARA2 is VERY buggy. You have to treat it like you're walking on eggshells; it sucks.

That being said, IMO nothing beats Spectralayers and Auto Align Post 2, so what I do to be able to work with them is:
I never open more than one event at a time in SL (I do open multiple events in AAP2, but I focus on it until I'm done and "make permanent"). I always "make extension permanent" as I go, event by event. I still get a weird glitch here and there, but even then, for me there's nothing better/faster. RX Connect is a pain in the ass too, so… nowhere to run to.

I don't get as much trouble with DOP as other people here, though. My main gripe is that it desperately needs a serious overhaul: many important features are missing or very clunky. But judging from the reports here, it has real problems too.
I have no idea why Steinberg seems to just ignore it.


I use RX Connect all the time and it works a lot better for me, except for the stem-separation feature, which is just better in Spectralayers.

Hopefully, Nuendo 15 will include this separation feature, just like the new C15 already does.

What's also confusing: RX is displayed as an ARA extension, but it just doesn't work in Cubendo. Why do they even list it then? :man_shrugging:

No idea.

I used to use RX Connect, but it was a MAJOR pain in the ass for me. So much so that I prefer to endure the many issues with ARA rather than go back to that!

We work 10 hours a day with "offline processing" on terabytes of data; we restore huge quantities of analog audio recordings. Offline Processing is a bit tricky, and you have to understand the system, but if you follow a few simple rules, it runs very smoothly and stably.
However, it apparently doesn't work quite right with some plugins, as you've demonstrated.

On the other hand, ARA is so unusable that we don't use it. Anything after bar 800 crashes Nuendo immediately.

Hello. Has anyone tried Spectralayers in ARA mode in Pro Tools?

Does it work well or is it just as unstable as in Nuendo?

I haven't tried it myself, but if it works well, good. I don't understand how Spectralayers and Nuendo can come from the same company and yet haven't been made to work better together…

I believe this is not a problem between Nuendo and SL specifically. I think it is an issue with the protocol itself (ARA is not a Steinberg protocol), but I could be wrong (perhaps the ARA implementation in Nuendo is flawed).

I say this because AAP2 and Acoustica are just as tricky in my experience, and Spectralayers is a beast of a program that I can imagine being much harder to run in ARA.

But in my case, since I learned to work around the most buggy behaviours, I must say it is reasonably stable. I use SL on almost every dialog clip in every film, and as long as I keep it to only one clip at a time, it usually doesn't give me problems.
Not unusable by any means.

What are those rules?

If you have any problems, you can ask ChatGPT directly. The following is from ChatGPT, and I can confirm it:

8 Golden Rules for Stable Direct Offline Processing (DOP) in Nuendo

  1. Work only on single audio events – not parts or group events.
    DOP is event-based and doesn’t behave reliably across combined parts.

  2. Use identical plug-ins for multiple selections.
    Applying different effects to several selected events at once will confuse the DOP chain.

  3. Use Undo only inside the DOP window.
    Global Undo (Ctrl+Z) can break or delete DOP entries.

  4. Avoid cutting or moving events after applying DOP.
    Every edit creates a new Event ID and breaks the link to the DOP history.

  5. Clear the render cache regularly.
    Old cache files can cause errors, especially in large or long-running projects.

  6. Apply effects step by step and click “Apply” after each one.
    Stacking multiple unrendered effects increases the risk of instability.

  7. Use plug-ins that fully support offline processing.
    Real-time plug-ins or those with latency often produce wrong results or crashes.

  8. Save and reopen the project after many DOP operations.
    This refreshes internal links and prevents lost or missing render references.

What are “group events”? Does that mean multiple events?

Then that is a bug and should be fixed. There is no argument here: doing this plugin-by-plugin is a workaround, since batch processing is a specifically stated feature of Nuendo's DOP.

Bug

Does it? I don’t recall seeing this. How do I test it?

Where can I find the “render cache”?

See above.

What? I’ve had literally zero crashes from applying DOP. I’ve had wrong results with only one plugin so far through version 13.

Really? I'm skeptical of this one. How would this have been tested in real life? To verify it, you'd have to save before performing "many DOP operations", perform them, see the system fail, revert to the earlier save, redo the operations while saving and reopening immediately, and then see the system succeed. And that would have to be repeated multiple times.

Additionally, literally all my problems happened before saving, so saving and reopening would not have fixed anything at all.


I was hoping to hear about your own rules, not ChatGPT's. I trust it about as much as I trust a politician.


Thanks. I know you are just the messenger; the following is not a critique of you.

Use identical plug-ins for multiple selections.
Applying different effects to several selected events at once will confuse the DOP chain.

IMHO this makes it totally unusable in a DAW. I really, really need to be able to shift a sound around on the timeline.

Film example:
Imagine a line of offscreen dialogue, I want to move it a few frames later.
I also need to do some additional processing with a denoiser because it is from a different take or a wild line.

If I look at these rules, I can’t do that? This will already break the system? I would hope not.

If we have to follow 8 golden rules just to make sure our DAW does not do unexpected things (like crashing)… perhaps it needs a redesign.

I rarely use DOP for this reason and keep processing to a minimum, and still it fails often enough that I resent it (just like ChatGPT, btw :slight_smile: )



I trust it less. Politicians know they’re lying.


It is impossible for me to write a manual now covering everything you need to consider during Direct Offline Processing.
But in many cases ChatGPT has been able to help me, because many Cubase/Nuendo users discuss these issues here.

Just this:
If an effect has been applied to an event via DOP, it becomes critical if you cut that event down into smaller events. If you then change values in one of the smaller events, or delete the effect, the whole structure collapses.
In most cases this is the cause. And it's not a bug: the DOP hierarchy is structured in such a way that it probably can't work any other way.

The cache is here (Windows):
C:\Users\<Username>\AppData\Local\Steinberg\Nuendo\Offline Processes
Close Nuendo.
Delete the entire Offline Processes folder.
→ It will be recreated automatically the next time you start Nuendo.
(This removes all cached DOP render files, but not the audio files in your project!)
Please make a backup beforehand and proceed at your own risk.
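The backup-then-delete steps above can be sketched as a small shell helper. This is my own sketch, not an official Steinberg tool; the function name and backup location are invented, and only the cache path comes from the post. Close Nuendo before running anything like this.

```shell
# Sketch: back up and clear the DOP render cache folder, at your own risk.
clear_dop_cache() {
    cache_dir="$1"    # e.g. "$LOCALAPPDATA/Steinberg/Nuendo/Offline Processes"
    backup_dir="$2"   # keep this until you have verified your projects still open
    if [ -d "$cache_dir" ]; then
        cp -r "$cache_dir" "$backup_dir"   # backup before deleting
        rm -rf "$cache_dir"                # Nuendo recreates the folder on next launch
        echo "cleared"
    else
        echo "no cache folder found"
    fi
}

# Typical call from Git Bash on Windows (adjust the path for your setup):
# clear_dop_cache "$LOCALAPPDATA/Steinberg/Nuendo/Offline Processes" "$HOME/dop-cache-backup"
```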

What do you mean by “whole structure”?

Sorry, I always have to translate this from German and I don't know if I'm expressing myself correctly. What I mean is that the entire chain of effects no longer produces the desired result, or the rendering process never finishes.
Example: timestretch an event, then split the event into three smaller events. Remove the timestretch from the middle event, and there's a high chance the other two events will lose their timestretch as well.
This is because the Direct Offline process always refers to the entire original event, not the three sub-events.
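The failure mode described above can be illustrated with a toy model. To be clear, this is not Steinberg's actual data structure; `Clip`, `Event`, and `split` are invented names, and the sketch only assumes what the post claims: that the process chain lives on the original (uncut) clip and that sub-events created by splitting share it.

```python
# Toy model of "the process is bound to the whole event, not the sub-events".
from dataclasses import dataclass, field

@dataclass
class Clip:
    name: str
    processes: list = field(default_factory=list)  # shared DOP chain

@dataclass
class Event:
    clip: Clip     # every sub-event points at the same underlying clip
    start: float
    end: float

def split(event: Event, at: float):
    """Splitting creates two events that still reference the parent clip."""
    return (Event(event.clip, event.start, at),
            Event(event.clip, at, event.end))

clip = Clip("dialog_take")
clip.processes.append("timestretch")

whole = Event(clip, 0.0, 10.0)
left, right = split(whole, 5.0)

# Removing the process from one sub-event removes it from all of them,
# because the chain lives on the shared clip, not on the event:
left.clip.processes.remove("timestretch")
print(right.clip.processes)  # → []
```

If this model is roughly right, copying the process list onto each sub-event at split time (instead of sharing it) would avoid the problem, which is presumably the kind of redesign people in this thread are asking for.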

You can't really take what ChatGPT says at face value. I've never heard of a render cache, and "don't cut or move events after applying DOP" makes zero sense; we do that all the time without issue. You can definitely apply DOP to multiple events at the same time, mostly without issue. I would, however, wait until that process is finished before processing further events (with some plugins this can cause issues on large numbers of events). And there's no need to save and reopen the project after DOP operations; imagine doing that on a film project.


Yes. It works around the clock for us too, and as I said, we don't have any problems. But there are always occasional situations where it fails because we work too quickly and don't wait for some processes to complete.
Every week we encounter situations where a rendering process just won't finish. This happens when too many effects are pushed into the Direct Offline process too quickly.

Well, that may be so. But to say that this way of working is by design and not a bug, when DOP doesn't block you or warn you when you split events that have processes on them, is absurd in my opinion.
And if it really does behave that way by design (I can't reproduce any problems on my side when splitting events with DOP on them), a redesign is long overdue!

We informed Steinberg (Germany) about this a long time ago, and their support team confirmed the problem.

The problem only occurs in this case:
An event is assigned Direct Offline Process "A". Then you split it. If you then assign further processes "B", "C", "D" to the sub-events, that works. But if you change process "A" on one sub-event, this can cause process "A" to malfunction on all the other sub-events as well.

You can tell because processes never finish rendering.
That's a fact.

ChatGPT explains that this is because the offline process is bound to the first (uncut) event (because it is one file) and therefore gets confused if the original process "A" is changed on a sub-event.
There is no separate offline process for each sub-event.
I can't verify this explanation, but it seems several people have discussed it, otherwise the AI wouldn't know about it.

If that’s the case, Steinberg would have to completely rebuild the entire handling of the offline process.

The renders do finish for me, but the processes don't render properly depending on what I do. Undo/redo certainly screws things up significantly.

But regardless, there are clearly bugs here, and they need to be fixed by Steinberg. It is absolutely ridiculous that we still have to deal with this.
