This happens to every other composer I know who uses Cubase or Nuendo, on both Mac and PC, and I’ve seen it mentioned on the Cubase forum numerous times: when you load up very large projects back-to-back, Cubendo will very often crash. Apparently it’s been an issue for many years – in front of clients it’s really crappy. For now I just close down Cubendo and open it up again, but that’s not palatable for such an expensive piece of software. I’ve had some of those clients ask me why I have to do it.
That’s simply because you run out of computing power and memory.
It’s a limitation of the computer, as simple as that.
Sorry, I can’t agree with that. This doesn’t happen on other DAWs I’ve used on the same computer, with the same large sessions. I’m not exaggerating – it literally never happened, and never happens, in Pro Tools or Reaper. Is Nuendo not releasing RAM properly in between projects? If it isn’t, then that’s the problem: if it doesn’t release all the RAM from the previous project when switching to a new one like it should, of course it will overload the computer. If you close and reopen the same session it doesn’t crash – it only happens when you switch to different projects. It shouldn’t be adding to the RAM for every new project (if that’s what’s happening), because multiple projects can’t be active at the same time; if they could, then what you’re saying would be correct.
Anyone else support fixing this longstanding and very reproducible issue? (Simply load up very large projects back-to-back to see it in action – sometimes it happens right away with the second very large project, and sometimes only after three or four different very large projects have been loaded back-to-back. It does NOT happen at all with small projects.)
Again, I’ve experienced or seen it on multiple systems from completely different users on both Windows and Mac…so no, it’s not my system.
I also see this very frequently and do not think that it is a memory/CPU limitation. I close a project, open a new one and crash. Restart Cubase and open and no crash. I’ve just gotten in the habit of closing Cubase between projects.
Re memory/CPU, Cubase should be able to free up all project resources when you close a project. If opening a project after closing one is not the same as opening one from scratch, that’s a Cubase issue, not a system issue.
I also experience this on three different Mac systems. Windows seems to be better.
That is incorrect. It crashes because Nuendo does not release the computer’s memory and other resources in an OS-compliant fashion. There are many modules within Nuendo, and they are not all coordinated to fully release the computer’s resources properly. If they were, Nuendo would not crash upon loading the new project, because the computer would have been returned to the state it was in before the previous project was opened.
You are a very advanced and educated user, Fredo. I respect your knowledge, but properly written software does not crash when unloading a project file and loading a different one. It is most certainly a flaw in Nuendo, not a flaw in a properly running computer. To me, it’s an extremely minor bug. As a percentage, closing and opening Nuendo between projects makes a very small difference in the speed of moving between large projects, since the project load time eclipses the Nuendo open/close time. So, for me, it’s far down the bug list compared to other issues. Like number 1000.
Nevertheless, it is a bug in Nuendo.
You make it sound way too easy.
It’s a tradeoff, and there will always be one side that complains.
Nuendo loads all of the plugins at startup, which means that while you are running the application there is virtually no hiccup when loading a plugin.
The other option is NOT loading all plugins at startup, which would shift the waiting to the point where you have to load/unload each and every individual plugin.
Secondly, it is the OS which has full control over the memory allocation of a plugin.
Nuendo loads a plugin (at startup), and that plugin (not the host) is “granted” a certain amount of memory from the OS.
So the memory “belongs” to the plugin, which means that Nuendo CAN NOT give the order to the OS to release that memory.
That is, if I am correctly informed, the way it is.
So it’s not a bug, it’s a side-effect of a fundamental choice that Steinberg has made in the past.
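To make the ownership split being described here concrete, here is a minimal Python sketch of the idea. All the names (Plugin, Host, releases_on_close, and so on) are purely illustrative – this is not Steinberg’s or the VST SDK’s actual API, just a model of “the host can only ask; the plugin owns its own memory.”

```python
# Hypothetical model of host/plugin memory ownership, as described above.
# The class and method names are invented for illustration only.

class Plugin:
    """A plugin allocates and owns its own state; only it can release it."""

    def __init__(self, name, releases_on_close=True):
        self.name = name
        self.releases_on_close = releases_on_close
        self.state = None

    def load_library(self, megabytes):
        # Plugin-owned allocation (e.g. a sample library loaded by a VSTi).
        self.state = bytearray(megabytes * 1024 * 1024)

    def project_closed(self):
        # Some plugins free their libraries on project close, some do not.
        if self.releases_on_close:
            self.state = None


class Host:
    """The host can only *ask* plugins to clean up; it cannot free their memory."""

    def __init__(self, plugins):
        self.plugins = plugins

    def close_project(self):
        for p in self.plugins:
            p.project_closed()

    def retained_megabytes(self):
        return sum(len(p.state) // (1024 * 1024)
                   for p in self.plugins if p.state is not None)


plugins = [Plugin("well-behaved"), Plugin("leaky", releases_on_close=False)]
host = Host(plugins)
for p in plugins:
    p.load_library(64)

host.close_project()
print(host.retained_megabytes())  # prints 64 – the leaky plugin's memory stays
```

The point of the toy model: after `close_project()`, the well-behaved plugin’s 64 MB is gone, but the leaky one’s 64 MB stays allocated until the whole process exits – which is exactly what restarting the application works around.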
Fredo, out of curiosity; does opening Nuendo projects accumulate reserved (for plugins) memory as you open/close them, or do these allocated memory spaces ‘reset’ upon closing?
Not sure if I can answer that, but as far as I remember -I had a long talk with one of the developers about this issue, but it is long ago- it depends on the plugin itself. Some of the memory is allocated through Nuendo, so that part is “blocked”; it can’t be released by Nuendo or by the plugin.
The plugin can’t release it because it is called by Nuendo, and Nuendo can’t release it because it belongs to the plugin.
But there is -or can be- another part of the memory which is allocated directly by the plugin, and that part can be released. For example (I am really not sure that what I say is correct), a VSTi is called by Nuendo, but the library that is opened within the VSTi plugin is called by the plugin itself. At that point, it’s up to the plugin to release that memory or not. Some plugins do, some don’t.
Maybe things have changed now with the newer OS, or new technologies, but this is how I remember it was explained to me by the developers.
Hmmm… very interesting. I suppose it’s one of those things that might creep up on designers and programmers over time but might not have been 100% clear at the time the standard was developed – and I mean by programmers of plugins and hosts alike. The most ‘logical’ way to do it, it seems to me, would be a bare-bones minimum memory footprint that a plugin defaults back to whenever it releases loaded content (a VSTi’s samples, maybe) or whenever a new project is activated (i.e. releasing the previous project’s memory).
On the other hand I do agree with you and Getalife; it’s not a huge deal, and with modern computers I would think this shouldn’t really be much of an issue unless one is loading/unloading projects a lot over the course of a working day. My next build will have a minimum of 16 GB or 32 GB of RAM depending on the platform I choose, so I really don’t see how I would get into trouble with that amount of memory…
Thanks for the info.
Here is the flaw in the “plugin owns the memory” logic: I have 32 GB of RAM. Nuendo will sometimes crash when closing and then loading fairly large projects back-to-back, even though they use a good bit less than half of my memory.
Makes it tough to blame plugins hogging memory for the crashes, I think.
The other trouble with excusing Nuendo from fault in this issue is that other DAW software on the same computer, with equally quick plugin loading, does not display this problem on equally large projects.
Please take no offense as this is not an important bug in my mind and to me, this is more of a friendly logic debate than anything unpleasant. Fredo, I appreciate the help you bring to the forums.
Ok, so maybe the issue is elsewhere then. I haven’t encountered this myself, so I can’t really even start to troubleshoot it (no large active projects in Nuendo right now). But it would be interesting to see some testing on it.
For example; if a user finds that loading one large project and then switching to another always results in a crash, could they try the reverse? Could they try re-arranging track order maybe (assuming plugins load in some non-alphabetical order)? Could they try dropping a specific plugin from the second project, just one at a time, to see if it’s related to a specific plugin? Could they try dropping tracks instead, to see if it’s a track-number issue, or possibly the amount of audio in the projects (which maybe then relates to waveform images)…?
I think this is a case where we need to do some specific testing ourselves because the reproduction procedure is so vague (“load up large projects back-to-back”).
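One way to make that elimination process faster, assuming the crash is deterministic and triggered by one specific plugin or track: bisect the list instead of dropping items one at a time. A hedged sketch – the `crashes()` callback stands in for “save a stripped-down copy of the second project with only this subset and see if loading it crashes”; there is no real API here, a human does that step by hand:

```python
def find_culprit(items, crashes):
    """Binary-search for the single item whose presence makes crashes() True.

    `crashes(subset)` reports whether a project containing only that subset
    of plugins/tracks crashes on load. Assumes exactly one culprit and a
    deterministic crash (which this thread suggests may not hold).
    """
    lo, hi = 0, len(items)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if crashes(items[lo:mid]):   # culprit is in the lower half
            hi = mid
        else:                        # otherwise it must be in the upper half
            lo = mid
    return items[lo]


# Example run: pretend "BigSampler" is the plugin that triggers the crash.
plugins = ["EQ", "Comp", "BigSampler", "Verb", "Delay", "Limiter"]
print(find_culprit(plugins, lambda subset: "BigSampler" in subset))
# prints BigSampler
```

With this approach a 64-plugin project needs about six test loads instead of 64, which matters when each “test” is loading a huge project and possibly crashing the app.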
If I get a minute I will try to get a little more specific on this. I’m obviously on PC from my sig, have lots of memory, and all my media loads off of SSDs so it’s not very time-consuming.
This bug doesn’t really bug me, but now my curiosity has been piqued and it would be interesting to get a reproducible case. The problem is that a large project contains so many plugins and virtual instruments that it’s going to be difficult to duplicate on another system, I think – unless it was purposely constructed with generic plugins and instruments. And I definitely do not have time to do that.
Memory blocks are allocated/spread over the different CPUs according to a specific protocol.
The blocks can not be swapped from one CPU to another; they stay “locked” in place.
When loading a second project, it can very well be that one processor is totally overloaded (combining the two projects), while other processors are virtually “empty”.
It is extremely complicated to have control over things that are basically controlled by the OS.
I’m actually not so sure that’s the case Fredo, but even if it was it’s going to ultimately depend on what the CPU and motherboard manufacturers decide to show to the OS.
If you look at the recent Threadripper architecture by AMD it’s actually two Ryzen dies on the same CPU package. You get the option to show a uniform setup to the OS meaning it will see one CPU with 16 cores and attached memory, or you can show the OS two CPUs with their own paths to separate memory - but even then the memory can and will be shared if necessary.
Most people don’t run multiple CPU systems anyway so it’s mostly a moot point.
For the sake of clarification, are we talking about:
1. Loading a second project and activating it while the first is still loaded?
2. Closing a large project completely and then loading another large project?
Fredo, your last post seems to indicate you are talking about Number 1 above. The OP was talking about Number 2 above.