VST SDK: Failure in Validator in Release Build

I get an error when I build my plugin in Release mode; it works fine in Debug mode. The error occurs in the validator while it checks Mono to Mono:

1>[In: Mono: 1 Channels, Out: Mono: 1 Channels]
1>Info: ===In: Mono: 1 Channels, Out: Mono: 1 Channels
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: The command "setlocal
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: "C:\Program Files\CMake\bin\cmake.exe" -E copy C:/Users/marku/Documents/GitHub/vst3sdk/cmake/modules/…/templates/VST_Logo_Steinberg.ico "C:/Users/marku/Documents/GitHub/VST3 Tutorial Projects/TestProject/build/VST3/Release/TestProject.vst3/PlugIn.ico"
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: if %errorlevel% neq 0 goto :cmEnd
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: "C:\Program Files\CMake\bin\cmake.exe" -E copy C:/Users/marku/Documents/GitHub/vst3sdk/cmake/modules/…/templates/desktop.ini.in "C:/Users/marku/Documents/GitHub/VST3 Tutorial Projects/TestProject/build/VST3/Release/TestProject.vst3/desktop.ini"
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: if %errorlevel% neq 0 goto :cmEnd
1>C:\Program Files\Microsoft Visual Studio\2022\Community\MSBuild\Microsoft\VC\v170\Microsoft.CppCommon.targets(166,5): error MSB3073: attrib +s "C:/Users/marku/Documents/GitHub/VST…

If I put a printf in my process method, I can see that the Mono-to-Mono test runs without problems for a while, but after a few seconds it stops and produces the error.

Here is the whole process method:

tresult PLUGIN_API TestProjectProcessor::process (Vst::ProcessData& data)
{
	//--- First : Read inputs parameter changes-----------

	//fprintf(stderr, "Beginn process: data.inputParameterChanges ");

    if (data.inputParameterChanges)
    {
        int32 numParamsChanged = data.inputParameterChanges->getParameterCount ();
        for (int32 index = 0; index < numParamsChanged; index++)
        {
            if (auto* paramQueue = data.inputParameterChanges->getParameterData (index))
            {
				//Vst::ParamValue value = 0;
				//int32 sampleOffset = 0;
				//int32 numPoints = paramQueue->getPointCount ();
				//switch (paramQueue->getParameterId ())
				//{
				//	default: break;
				//}
			}
		}
	}

    //--- Here you have to implement your processing

    if (data.numInputs == 0 || data.numOutputs == 0)
    {
        return kResultOk;
    }

    // Assumption: stereo input and output
    int32 numChannels = 2;
	uint32 sampleFramesSize = getSampleFramesSizeInBytes(processSetup, data.numSamples);
	void** in = getChannelBuffersPointer(processSetup, data.inputs[0]);
	void** out = getChannelBuffersPointer(processSetup, data.outputs[0]);
	

	for (int32 i = 0; i < numChannels; i++)
	{
		// do not need to be copied if the buffers are the same
		if (in[i] != out[i])
		{
			memcpy(out[i], in[i], sampleFramesSize);
		}
	}


	return kResultOk;
}
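
For reference: the validator's Mono-to-Mono pass only provides one channel per bus (see the "In: Mono: 1 Channels" line above), so a hardcoded numChannels = 2 reads past the end of the mono channel pointer arrays, which is one likely reason the mono test falls over. Here is a minimal pass-through sketch that takes the count from the buses instead, reusing the same SDK helpers as the code above (just a sketch, not tested against this project):

	// Sketch: derive the channel count from the buses instead of assuming stereo.
	int32 numChannels = data.inputs[0].numChannels;
	if (data.outputs[0].numChannels < numChannels)
		numChannels = data.outputs[0].numChannels;

	uint32 sampleFramesSize = getSampleFramesSizeInBytes (processSetup, data.numSamples);
	void** in = getChannelBuffersPointer (processSetup, data.inputs[0]);
	void** out = getChannelBuffersPointer (processSetup, data.outputs[0]);

	for (int32 i = 0; i < numChannels; i++)
	{
		// no copy needed if the host processes in place
		if (in[i] != out[i])
			memcpy (out[i], in[i], sampleFramesSize);
	}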

Thanks for any help!

If I use the following code, the "setlocal" error occurs in Debug mode as well:

    int32 numChannels = 2;
	uint32 sampleFramesSize = getSampleFramesSizeInBytes(processSetup, data.numSamples);
	//void** in = getChannelBuffersPointer(processSetup, data.inputs[0]);
	//void** out = getChannelBuffersPointer(processSetup, data.outputs[0]);
	

	for (int32 i = 0; i < numChannels; i++)
	{
		double* inputChannel = data.inputs[0].channelBuffers64[i];
		double* outputChannel = data.outputs[0].channelBuffers64[i];

		for (int32 frame = 0; frame < data.numSamples; frame++) {
			outputChannel[frame] = inputChannel[frame];
		}
		// do not need to be copied if the buffers are the same
		//if (in[i] != out[i])
		//{
		//	memcpy(out[i], in[i], sampleFramesSize);
		//}
	}

I can use the example code from adelay or again, but nothing works in my own project.

And this code compiles without errors in Debug AND Release, but it throws a runtime exception on the outChannelBuffer[sampleIndex] = inChannelBuffer[sampleIndex] line:

	auto inputs = data.inputs;
	auto outputs = data.outputs;

	for (auto channelIndex = 0; channelIndex < inputs[0].numChannels; ++channelIndex)
	{
		auto inChannelBuffer = data.inputs[0].channelBuffers64[channelIndex];
		auto outChannelBuffer = data.outputs[0].channelBuffers64[channelIndex];
		for (auto sampleIndex = 0; sampleIndex < data.numSamples; ++sampleIndex)
		{
			outChannelBuffer[sampleIndex] = inChannelBuffer[sampleIndex];
		}
	}

Failure:

Exception thrown at 0x00007FFE51880EDE (TestProject.vst3) in VST3PluginTestHost.exe: 0xC0000005: Access violation reading location 0x000000000BEDF000.

"VST3PluginTestHost.exe" (Win32): Loaded "C:\Windows\System32\psapi.dll". Symbol loading disabled by Include/Exclude setting.
"VST3PluginTestHost.exe" (Win32): Unloaded "C:\Windows\System32\psapi.dll".
The thread 7208 has exited with code 1 (0x1).
The thread 3372 has exited with code 1 (0x1).
Unhandled exception at 0x00007FFF13F1502E (ucrtbase.dll) in VST3PluginTestHost.exe: Fatal program exit requested.

So I try and try and nothing works.

EDIT: Sometimes it works after a restart, sometimes the exception is thrown. I can see that sometimes, on the very first call to process, the output buffer pointer is 000000000000.

Here is more information from the exception:

Exception thrown at 0x00007FFE34E60EDE (TestProject.vst3) in VST3PluginTestHost.exe: 0xC0000005: Access violation reading location 0x000000000ACAF000.

The sample index was 156, so 155 samples worked fine.
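
For what it's worth, channelBuffers32 and channelBuffers64 are a union inside Vst::AudioBusBuffers, so reading the buffers through channelBuffers64 while the host actually delivers 32-bit data runs past the end of the real float buffers, which would fit a crash part-way through the block. The null output pointer on the first call looks like a separate case: hosts may call process without audio buffers (for example to flush parameters), so the pointers should be checked as well. A minimal guard sketch, assuming the standard symbolicSampleSize field and the kSample32/kSample64 constants from ivstaudioprocessor.h:

	// Sketch: only touch channelBuffers64 when the host really chose 64-bit processing.
	if (data.numSamples == 0 || data.numInputs == 0 || data.numOutputs == 0)
		return kResultOk; // e.g. parameter flush without audio buffers

	int32 numChannels = data.inputs[0].numChannels;
	if (data.outputs[0].numChannels < numChannels)
		numChannels = data.outputs[0].numChannels;

	if (data.symbolicSampleSize == Vst::kSample64)
	{
		for (int32 ch = 0; ch < numChannels; ++ch)
		{
			Vst::Sample64* inBuf = data.inputs[0].channelBuffers64[ch];
			Vst::Sample64* outBuf = data.outputs[0].channelBuffers64[ch];
			if (inBuf && outBuf && inBuf != outBuf)
				memcpy (outBuf, inBuf, data.numSamples * sizeof (Vst::Sample64));
		}
	}
	else // Vst::kSample32
	{
		for (int32 ch = 0; ch < numChannels; ++ch)
		{
			Vst::Sample32* inBuf = data.inputs[0].channelBuffers32[ch];
			Vst::Sample32* outBuf = data.outputs[0].channelBuffers32[ch];
			if (inBuf && outBuf && inBuf != outBuf)
				memcpy (outBuf, inBuf, data.numSamples * sizeof (Vst::Sample32));
		}
	}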

OK, I fundamentally misunderstood something. I thought that for 64-bit plugins I only need to use channelBuffers64 and that the plugin only needs to support 64-bit processing. I'm confused now, but if I convert everything to channelBuffers32, it works. The plugin also loads in Ableton Live 12. (Before that, the Ableton log file showed the error "Plugin Processor does not support 32bit float".)

BUT: why does the plugin have to support and implement 32-bit processing on a 64-bit computer in a 64-bit DAW?

I think you are mixing up 64-bit for the binary (dll, exe) and the buffer precision used for processing (32- or 64-bit float).
For example, a 32-bit app could process a plugin with 64-bit audio buffers…

Thanks for the answer. I thought it would be standard that 64-bit plugins (and hosts) also process in 64-bit (better headroom, etc.) AND provide 32-bit for performance reasons. But a little research shows that this isn't the case :)
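
To make the precision handshake concrete: the host asks the plug-in which sample formats it supports via IAudioProcessor::canProcessSampleSize and then announces its choice in setupProcessing (ProcessSetup::symbolicSampleSize) and in each ProcessData::symbolicSampleSize. A rough sketch of the usual override, using the class name from this thread:

	tresult PLUGIN_API TestProjectProcessor::canProcessSampleSize (int32 symbolicSampleSize)
	{
		// 32-bit float is what hosts like Ableton Live request here.
		if (symbolicSampleSize == Vst::kSample32)
			return kResultTrue;

		// Only claim 64-bit support if process () really handles channelBuffers64.
		if (symbolicSampleSize == Vst::kSample64)
			return kResultTrue;

		return kResultFalse;
	}

With that in place, process () just branches on data.symbolicSampleSize and picks channelBuffers32 or channelBuffers64 accordingly.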