WL11 Render Issue

And you have used the batch processor exactly as in my picture?
There are automated tests running on WaveLab almost every day, where results are checked with bit accuracy, and resampling is part of them. This is why I trust this process so much (and have for many years).
Please try again, and if you get the same problem, I would ask you to send me your file.

I have to draw a line. If/when I get time to bug test WL11 I will but I have wasted an enormous amount of time and I’m days behind on this project.

Sorry for laughing out loud here, but this does not add to your credit… :stuck_out_tongue_closed_eyes:
Also, you may not be aware, but PG1 is the developer of WaveLab, and if anyone can help you it is him. If you have time to write nonsense about Sequoia’s better sound quality, you might as well take two minutes to send your problem file…

1 Like

With all due respect, you are in no position to dish out ‘credit’. Not all upsampling is equal (I suggest you do some research before making smart arse comments). Just because YOU can’t hear something does not make it so! Go try it yourself (there’s a free demo of sequoia 16 available). You may need to upgrade your monitoring and d/a setup first as you probably can’t hear the difference between 44.1 and 96k on your advertised monitoring system.

I’m also aware of who PG1 is. I can (and do) use Google. I’ve gone back to the software that works for my needs. It’s a shame; WL11 has many features that would work for me, but the cons outweigh the pros.

When the track is released in November I will gladly send the file but I’ll let you beta test going forward. :kissing_heart:

1 Like

If the new Sequoia resampler is using linear phase and standard-range settings in your usage, there should be (in my experience testing) no audible difference between the WaveLab resample and the Sequoia resample, even in a null… Null the WaveLab upsample against the Sequoia upsample and listen to the difference. From my testing with other current high-quality resamplers, there should be nothing at all to hear in the null if linear phase is used in both and the other settings are in standard ranges.

Then do a blind test to confirm or refute that.

(just to note, two resamplers, Saracon and Crystal Resampler, always require subsample time alignment in order to null to others. I don’t really expect that here, but If it does happen, just do the null in DeltaWave Null Comparator instead of a DAW. DeltaWave will do the subsample time alignment automatically).

If the new Sequoia resampler is using minimum or intermediate phase in your usage, the upsamples won’t null. In that case, do a multi-trial blind test between the two upsamples using Hofa Blind Test, DeltaWave Null Comparator, or Foobar2000 ABX to see if a difference actually can be repeatedly discerned. If it can, which upsample is “better” might be a matter of opinion. The phase response of the linear phase upsample (in this example, WaveLab) will be like that of the original file, but the minimum phase upsample’s won’t, and there will be an audible difference in the null. The minimum phase version might sound better to you, in the way it is sometimes said to sound better in MQA, but the phase response is different from that of the original file.
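A null test like the one described above can also be done offline. Here is a minimal sketch in Python/NumPy, assuming both renders are already time-aligned and loaded as float arrays (e.g. with a library like soundfile); the `render_*` signals below are fabricated stand-ins, not real resampler output:

```python
import numpy as np

def null_residual_db(a: np.ndarray, b: np.ndarray) -> float:
    """Peak level in dBFS of the polarity-inverted sum (a - b)."""
    n = min(len(a), len(b))                  # trim to the shorter render
    peak = np.max(np.abs(a[:n] - b[:n]))
    return 20.0 * np.log10(peak) if peak > 0.0 else float("-inf")

# Fabricated stand-ins for two upsampled renders of the same source:
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 96000, endpoint=False)
render_a = 0.5 * np.sin(2.0 * np.pi * 1000.0 * t)
render_b = render_a.copy()                                # identical render
render_c = render_a + 1e-5 * rng.standard_normal(t.size)  # tiny difference

print(null_residual_db(render_a, render_b))  # -inf: a perfect null
print(null_residual_db(render_a, render_c))  # finite, well below -80 dBFS
```

If the residual is not at (or extremely close to) negative infinity, there is something left in the null worth listening to; as noted above, DeltaWave can handle the subsample alignment automatically when two resamplers don’t line up exactly.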

I asked a while ago (FR: Resampler Parameter Controls)
but I’d like to ask again that the SoX built-in resampler parameter controls be made available to the user in WaveLab, so we can compare these things realistically and also have settings options besides just the four linear phase presets.

Pyramix has a minimum phase option, iZotope RX has always had complete parameter options, and Audirvana and Foobar2000 have complete option controls for their SoX parameters, as does the SoX command-line program. Audirvana even has a “max filter length” control for their SoX that I’ve never seen in the SoX commands.

If other mastering programs like Pyramix (and maybe now Sequoia) are offering controls or options for phase, bandwidth, and aliasing, it seems like WaveLab should too, especially since the commands already exist within SoX. An added “Custom” selection on the existing resampler menu that would open a small window and allow us to adjust three or four of the SoX parameters would be great, and would allow WaveLab to cover all of the options that iZotope does.

Then we could do minimum phase, or a gentler filter with aliasing like the iZotope default setting 32-1-1 (which is exactly the same range as Saracon, and probably chosen because of Saracon), or, I would hope, the steeper 99.7 filter (as iZotope can do), and everything in between. Basically everything that can be done in the SoX command-line program that can’t be done with just the few fixed presets in WaveLab, although those are great too.
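For reference, the SoX command-line `rate` effect already exposes those parameters; a few illustrative invocations (flag meanings per the SoX manual, file names are placeholders, and the bandwidth values are just examples within the allowed 74–99.7 % range):

```shell
# Upsample to 96 kHz with the SoX 'rate' effect.
# -v = very high quality; phase: -L linear, -I intermediate, -M minimum;
# -b = bandwidth in percent; -a = allow aliasing above the passband.

# Linear phase (comparable to the fixed presets in WaveLab):
sox in_44k1.wav out_96k.wav rate -v -L 96000

# Minimum phase instead of linear:
sox in_44k1.wav out_96k.wav rate -v -M 96000

# Gentler filter with aliasing allowed:
sox in_44k1.wav out_96k.wav rate -v -a -b 90 96000

# Steepest available bandwidth (99.7 %):
sox in_44k1.wav out_96k.wav rate -v -b 99.7 96000
```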

Is this necessary? Maybe not. I might return to the “Best” preset setting after any future testing, but I don’t know that for sure.

But all of these controls were added to the SoX program, and they should be made available like they are in iZotope, imo.

I’m having issues with rendering in WL11.
While surfing the web for solutions/reasons, I ended up here for similar reasons.
I think some plugins just get bypassed (or parameters not copied to the rendering instance).
I rendered in realtime, but the issue continues.
Next step, find which plugins are causing the problem.

Makes me really sad having to waste time on this…
And it makes me look bad in front of the people I work with; they don’t care about bugs, only the music…

Moreover, I had some clients complaining about shorter beginnings of songs, but that’s for another post.

OSX Monterey, Mac mini M1, WL11, VST3

@dcocharro

99% of the time, any rendering issues are something that the plugin developer needs to fix.

The only known WaveLab rendering issue right now is that if you have DYN enabled in the Preferences for any VST3 plugins, they will not render if used as Clip FX. This was discovered last week and will be fixed in the next update.

Otherwise, general rendering issues for random plugins 99.9% of the time have to be fixed by the plugin developer.

I have a solid core group of plugins that I have tested and know work correctly in WaveLab. I re-test them now and then, and I also make sure to check outgoing files for issues, which in the world of mastering is part of the job: quality control.

I would try to narrow down, one by one, which of the plugins you are using are not rendering correctly, or at all.

I’ve had issues fixed by DMG, Sonnox, Goodhertz, iZotope, FabFilter, Eiosis and a few others but none of those issues could have been fixed by WaveLab. It’s within the plugin where the issues occur.

I once had the attitude that if a plugin works in Cubase or Pro Tools it should work in WaveLab but after 10+ years of using WaveLab, I have learned that isn’t true and that some plugin developers are more willing than others to properly test and support WaveLab as a host for their plugins.

1 Like

My humble experience with Sequo-a (and Sequo-a and I didn’t get along all that well) is that it can sound subjectively ‘better’, or at least ‘different’, when listening to it. Sure, the renders null perfectly against WL etc. But when listening to it, to me it kind of very subtly pushed something forward, like a bump. Maybe it’s much the same phenomenon that was reported by Sonic Solutions users in the early days, who boasted about pristine playback quality…

For some I guess this helps them make good decisions. Others (like me) maybe prefer the playback transparency of WL.

Not all will agree with my highly subjective observation but I can say that I know that I am not alone in coming to that conclusion.

My guess is that this is what ondre was referring to.

1 Like

I also always test plugins and tend not to use them for at least a month while I get to know them and am completely satisfied that they will integrate without issue both in the workflow and WL.

This applies to my core group of five plugins. All have been in use over various versions of WL. They load correctly, behave correctly, the GUI displays correctly, they have never crashed WL. In my humble opinion they are all coded by developers that definitely know what they are doing. I am in contact with them as well. When the plugs are updated I always re-test them and a few times have stayed with the previous version until a possible issue has been fixed by the developer. In the past there have been zero render issues.

In that context, I need to understand why … very infrequently and completely randomly … renders are ‘incorrect’. Not only can I hear and ‘see’ this, but a render in 9.5 does not null as it should.

I am always rendering from a montage and nearly always the plugin chain is on ‘Output’ so no potential for Clip issues.

Perhaps the answer is that WL 11 deals with plugins in a subtly different way, and something in, say, a Sonnox plug (which I have been using since the Sony Oxford days, by the way) might randomly fail to send a plugin-state message to WL, causing it to ‘switch off’ when WL makes the duplicate render montage in the background. I understand that the montage was modified in WL 11 to give priority processing (or something similar). Something has obviously changed, or a saved chain would be backward compatible.

If it’s that simple well and good and I’ll either try and find a plugin substitute or revert to an earlier WL version until I sort it all out at my end.

The same plugin chain continues to work flawlessly in 9.5 and I cannot reproduce the issue in 11 at all let alone reliably.

WL 11 is amazing and to allay any doubt I am not criticizing it. I just want to understand ‘why’.

2 Likes

Thank you all for your replies, I’m 100% with you guys.
Definitely, turning “Dynamic” off in the plugin settings solved one of the issues; the other one needs further investigation, as at the moment I don’t understand the problem enough to be able to recreate it.
@Justin_Perkins you just gave me an idea: it would be interesting to have some kind of communal Google Sheet where each user could report buggy plugins in WL. This list would call other users’ attention to be careful with those problematic plugs, and moreover call the developers’ attention to fix those bugs and make them aware that there’s a community of WL users that uses their plugs.

About testing plugins, what strategy do you guys generally use to test/inspect your plugins in WL?
I was thinking about preparing a project/montage with some files, namely pink noise, a sweep and perhaps a Dirac impulse, then applying some processing with one plugin and inspecting the rendered file for differences (e.g. duration, spectrum, etc.).
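The signal-generation part of that plan could be sketched like this, using only NumPy and the stdlib `wave` module (16-bit PCM here just for simplicity; the file names, lengths and sweep range are arbitrary choices of mine):

```python
import wave
import numpy as np

SR = 96000          # sample rate
DUR = 5.0           # seconds per test file
N = int(SR * DUR)
rng = np.random.default_rng(1)

def write_wav(path: str, x: np.ndarray) -> None:
    """Write a float array in [-1, 1] as 16-bit PCM mono."""
    pcm = (np.clip(x, -1.0, 1.0) * 32767).astype("<i2")
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SR)
        f.writeframes(pcm.tobytes())

# Pink noise: shape white noise by 1/sqrt(f) in the frequency domain.
spectrum = np.fft.rfft(rng.standard_normal(N))
freqs = np.fft.rfftfreq(N, 1 / SR)
freqs[0] = freqs[1]                       # avoid division by zero at DC
pink = np.fft.irfft(spectrum / np.sqrt(freqs), n=N)
pink /= np.max(np.abs(pink))              # normalise to full scale

# Logarithmic sweep from 20 Hz to 40 kHz.
f1, f2 = 20.0, 40000.0
t = np.arange(N) / SR
k = np.log(f2 / f1)
sweep = np.sin(2 * np.pi * f1 * DUR / k * (np.exp(t / DUR * k) - 1))

# Unit (Dirac) impulse at sample 0.
impulse = np.zeros(N)
impulse[0] = 1.0

for name, sig in [("pink.wav", pink), ("sweep.wav", sweep),
                  ("impulse.wav", impulse)]:
    write_wav(name, sig)
```

After rendering these through a plugin, the output duration, spectrum and impulse position can be compared against the originals.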
Any other suggestions?

Thank you all,
Diogo

Hi

For me, and of course others will have different approaches, testing would include:

  • Insert in the Master Section and press play on a 32-bit float 96 kHz file. Is there an unexpected delay at the start of playback? Does it crash WL? Does it look like it’s working as expected?
  • Does the GUI display correctly? Are all elements of the GUI responsive and behaving predictably?
  • Can you load, save and recall presets?
  • If it is inserted with other plugs from your ‘starting point’ chain, does everything play and sound like it should? Are the meters working as expected?
  • What happens when you bypass the plugin?
  • What happens when you remove the plugin? Does this cause WL to hang or crash?
  • Insert in the montage and basically repeat these steps.
  • I have a pink noise file (similar to Justin’s test montage). I would render this as well as a ‘real’ audio track and carefully inspect the render. Has it cut off the beginning? Has it rendered correctly through markers? Is the phase still as expected? What does the tail look like? Listen to the file… dropouts or glitches anywhere?
  • If you intend to use the plug in a batch processor, basically repeat that process there.
  • Typically, I see what’s happening in Plugin Doctor, although that will not always be a guide to WL behavior and sometimes is not that useful.
  • This is subjective: audition the plugin over a series of sessions and time, and see if it adds anything. In my case, I am always surprised at how few plugins I buy actually make the cut.
  • When you shut down your computer, does anything unexpected happen? (I, and others, have experienced hangs after installing a version of Waves, for example.)

Most of this is obvious, but you asked, and I hope this helps.

1 Like

I recently made a test montage of stereo pink noise clips arranged as faux album tracks that have no space between them. If you play the montage from start to finish there are no disruptions.

You can download it HERE.

Then I added track markers and normal things I do when mastering an EP or album.

Then I can add various Clip Effects to the clips and make sure that they are actually processing the audio on renders, that there is no missing or corrupt audio at the start, end, or anywhere within the rendered audio.

For me, that’s the main thing. It’s also good to know if settings are retained when you close and reopen the Audio Montage, and if the GUI appears correct.

Sometimes the way certain plugins interact in a chain can be a factor too. For example, at one point, DMG Limitless was adding a short fade in to renders, but if you put nearly any other plugin before it, the problem went away.

I don’t use the Master Section for any plugin processing, but if you do, it’s important to test plugins in the Master Section as well as in the Audio Montage, and vice versa.

It’s also good to be aware of whether you’re testing and ultimately using the VST3 or VST2 version, because I’ve seen cases where the VST3 version has a bug but the VST2 version does not, or vice versa.

Basically, just some simple (in my opinion) testing can go a long way vs. just rendering files and sending them off without checking and listening to them, assuming the processing was applied, and applied correctly. This is especially true in mastering, when we’re sometimes making EXTREMELY subtle changes that are hard to hear in a render, making it easy to miss that they didn’t render at all.
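That kind of outgoing-file check can be partly automated. A rough sketch, assuming the source and rendered audio are already loaded as float arrays (`check_render`, its threshold, and the edge window are all arbitrary choices of mine, and the demo audio below is fabricated):

```python
import numpy as np

def check_render(source: np.ndarray, rendered: np.ndarray,
                 threshold: float = 1e-4, edge: int = 4800) -> list[str]:
    """Compare a rendered file to its source: length plus silent edges."""
    problems = []
    if len(rendered) != len(source):
        problems.append(f"length mismatch: {len(rendered)} vs {len(source)}")
    n = min(len(source), len(rendered), edge)   # ~50 ms at 96 kHz
    if (np.max(np.abs(source[:n])) > threshold
            and np.max(np.abs(rendered[:n])) <= threshold):
        problems.append("start is silent in the render but not the source")
    if (np.max(np.abs(source[-n:])) > threshold
            and np.max(np.abs(rendered[-n:])) <= threshold):
        problems.append("end is silent in the render but not the source")
    return problems

# Demo on fabricated audio: a good render and one with a truncated start.
t = np.arange(96000) / 96000.0
src = 0.5 * np.sin(2.0 * np.pi * 440.0 * t)
good = src.copy()
bad = src.copy()
bad[:4800] = 0.0                      # simulate a cut-off beginning

print(check_render(src, good))        # [] — nothing flagged
print(check_render(src, bad))         # flags the silent start
```

A real version would also null the two signals, as EXTREMELY subtle processing can render correctly at the edges but still be missing in the middle.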

A while back I was lobbying for a “null test track”. Kind of like a reference track, but instead a track that you could add to your Audio Montage, put your rendered audio on, have it be reverse polarity by default, have it skip the Montage Output and Master Section, and just reveal on playback and/or a new render whether there are any differences between your master montage live playback and what got rendered.

It would be a nice feature.

1 Like

SOLVED: I have to let you know that once again I experienced a single random and obviously incorrect render.

There was a gain stage in the plugin chain, and the rendered file looked and sounded like only that first plugin had rendered. The other three plugins … DMG EQ, Sonnox Limiter and Voxengo Limiter … seem to have been ignored.

This was mystifying because I had been using this identical chain in other projects that day and over the days prior. Every render perfect.

The plugins were all VST 3 and DYN was not checked in preferences.

The plugs were inserted in the Output section of the montage (so no clip complexity). It was a simple single (one track) render. The montage was 32 bit float 96 kHz.

I was thinking that ‘something’ is causing one or more plugins to incorrectly report their ‘state’ to WL or WL is incorrectly reading that ‘state message’. Possible?

The PC is a relatively powerful machine, purpose-built for mastering only: i7, 32 GB of RAM, with an RME HDSPe AIO card sending AES only.

I’m on the latest version of WL Pro 11.0.30 build 114 and the latest build of Windoze 10 Pro.

WL 11 is super stable and otherwise works perfectly. I cannot recall one single crash in 11. Rock solid.

I have been a WL user since at least version 5 (full on since WL 6) and have never seen this behavior before.

I have checked and double checked all render settings. But, as I said, everything was working perfectly prior to this … even the session immediately before rendered perfectly with the same set of plugins to the same destination drive.

This behavior is rare and completely random and I cannot reproduce it. It is therefore not useful to suggest removing plugs one at a time etc. because the next time you render … like the time before … it will almost certainly be OK.

While it could be plug in related, it’s challenging to understand why it works correctly 99.9% of the time. It might be something I am doing, but I cannot think what that might be.

Maybe there is something going on in my system that I have not yet identified.

Potentially, the incorrect render issue could be very easy to ‘miss’ in a project where you are, for example, making subtle revision changes.

I can say that I am pretty sure it has only been happening since the update to 11.0.30.

These identical plugins continue to work 100% OK in WL 9.5

Any thoughts?

Hi!

Just guessing here…
Could this have something to do with buffering in the plugin order?
I mean, one or more plugins can’t wait and override the others timing-wise,
so other plugins thereby miss out in the render!?

regards S-EH

Thank you for that thought.

My understanding is that the plugin processing is ‘linear’, so step 3 cannot be processed until the results of steps 1 and 2 are known, and so on. Timeouts should therefore not be an issue? It has been a while since I read Pohlmann’s Principles of Digital Audio, though, so I could be wrong.

I was thinking that ‘something’ is causing one or more plugins to incorrectly report their ‘state’ to WL or WL is incorrectly reading that state message.

Why does the same plugin chain render correctly almost 100% of the time though.

I can’t imagine a reason for this.
When this happens, if you do the operation again, without modifying anything, does it work this time?
What is the output file format?

To verify this assumption, can you check by redoing the render with the other plugins muted, and doing a file comparison?

Try erasing this file:
\AppData\Roaming\Steinberg\WaveLab Pro 11\Preferences\PluginSettings.dat
and restart WaveLab.

1 Like

I have no idea either … I have closed and re-opened WL and it works 100% perfect.

The output file format is the same as that being rendered: 32 f 96 pre and 32 f 96 post.

When this happens, it is not always the first plug. I have had it render only, say, two plugs. I have verified that in the past.

I will delete the preferences… I did not think of that.

Thank you for your thoughts.

OK … so I deleted PluginSettings.dat

This has definitely done something.

Before making my post I created a montage in 9.5. I then rendered that. I closed 9.5 and opened 11. Imported the montage and rendered that.

The renders did not null as they should. That’s when I decided to post.

I have just tried the same exercise and the files now null.

All I can think of is that somehow the .dat file has become corrupted.

Thank you for this insight.

As always I appreciate the thought.

2 Likes

hey @PG1
it sounds like the solution to my problem in this other thread was the same, but instead of deleting this .dat file I simply changed my preferences choices and voilà, the bug was gone! :wink:

Nice move! o/

1 Like