VST3 hosting: when to use IEditController::setParamNormalized?

In the case of a host-provided generic UI for a plugin's parameters, if setParamNormalized is called, some plugins will only update their UI and not their internal sound engine. So it seems the correct way is to always queue parameter changes for dispatch in the audio thread instead, even when these changes are UI-thread edits.

If that’s the case, what is the purpose of setParamNormalized on the edit controller? Should a host ever call this, and for what?

(I do hope the answer is not that the host is supposed to call this as well as dispatching parameter changes in the audio thread… surely the plugin is responsible for managing its own internal state and reflecting this back to its own view. The view should be a black box to the host imo)

You call IEditController::setParamNormalized if you change the parameter on the host side, for example when automating parameters, or when providing your own controls for the plug-in parameters like a generic editor.

Ok, that is what I expected. But as explained, some plugins (Kilohearts is a known example) will not update their sound engine when this is called.
So either
(a) in the implementation of a generic editor, the host should call this AND queue parameter change events in the audio thread, or
(b) plugins such as Kilohearts are buggy.
Could you please clarify whether (a) or (b) is correct? I'm not clear on what the correct implementation is.

(a) is the correct way. The host is always responsible for queuing realtime parameter changes and providing these changes to the IAudioProcessor in the process call.
There are some badly designed plug-ins that share their parameter state between the edit controller and the audio processor, so they may appear to work if you just set the parameter on the edit controller, but this is wrong.

The normal way for plug-ins is that they call the host when a parameter is changed in the UI thread via IComponentHandler::beginEdit/performEdit/endEdit. The host then sends these changes via IParameterChanges to the IAudioProcessor in the IAudioProcessor::process call.
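
For illustration, here is a rough sketch of what a host might do inside its IComponentHandler::performEdit implementation. The FUnknown/beginEdit/endEdit boilerplate is omitted, and the queue and its locking are hypothetical stand-ins (a real host would typically use a lock-free FIFO):

```cpp
#include <mutex>
#include <vector>
#include "pluginterfaces/vst/ivsteditcontroller.h"

// Hypothetical host-side queue of pending changes, drained by the audio
// thread at the start of each process call.
struct PendingParamChange
{
    Steinberg::Vst::ParamID id;
    Steinberg::Vst::ParamValue value;
};

std::mutex gQueueMutex;
std::vector<PendingParamChange> gUiToAudioQueue;

// Roughly what a host's IComponentHandler::performEdit could do when the
// plug-in's own editor changes a parameter on the UI thread.
Steinberg::tresult hostPerformEdit (Steinberg::Vst::ParamID id,
                                    Steinberg::Vst::ParamValue valueNormalized)
{
    // The plug-in already updated its own controller/UI, so the host does not
    // call setParamNormalized back; it records the automation point (not
    // shown) and queues the change for the next process call.
    std::lock_guard<std::mutex> lock (gQueueMutex);
    gUiToAudioQueue.push_back ({id, valueNormalized});
    return Steinberg::kResultOk;
}
```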

If a host implements a generic editor, the host has to send the parameter changes to the audio processor via IParameterChanges in the IAudioProcessor::process call, and also to the edit controller via IEditController::setParamNormalized.
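
As a sketch of that, the only SDK call below is setParamNormalized; the queueing function is a hypothetical placeholder for however the host hands changes to its audio thread:

```cpp
#include "pluginterfaces/vst/ivsteditcontroller.h"

// Hypothetical: hand a change to the audio thread, e.g. via a lock-free FIFO.
void queueForAudioThread (Steinberg::Vst::ParamID id, Steinberg::Vst::ParamValue value);

// Called on the UI thread when the user moves a control of the host's
// generic editor.
void onGenericEditorValueChanged (Steinberg::Vst::IEditController* controller,
                                  Steinberg::Vst::ParamID id,
                                  Steinberg::Vst::ParamValue normalized)
{
    // 1) Keep the plug-in's controller (and its own editor, if open) in sync.
    controller->setParamNormalized (id, normalized);

    // 2) Queue the change so it reaches the IAudioProcessor via
    //    IParameterChanges in the next process call.
    queueForAudioThread (id, normalized);
}
```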

So if the host automates the plugin, it's also supposed to update its view with setParamNormalized?

Yes, the host should synchronize the UI (controller part) with setParamNormalized.

Note that a host has to compensate for the delay between the time the parameter changes are sent to the AudioProcessor (which processes ahead of the audio that is audible at the speakers) and the time they are sent to the controller (which shows the state of the audio audible at the speakers).
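
As a sketch of that idea, the scheduling helper and the latency figures below are hypothetical; only setParamNormalized is a real SDK call:

```cpp
#include <functional>
#include "pluginterfaces/vst/ivsteditcontroller.h"

// Hypothetical host utility: run a callback on the UI thread after a delay.
void scheduleOnUiThread (double delaySeconds, std::function<void ()> callback);

// The audio thread may render ahead of what is audible, so the controller
// update for an automated value is deferred until the corresponding audio
// actually reaches the speakers.
void updateControllerForAutomatedValue (Steinberg::Vst::IEditController* controller,
                                        Steinberg::Vst::ParamID id,
                                        Steinberg::Vst::ParamValue value,
                                        double renderAheadSeconds,
                                        double outputLatencySeconds)
{
    double delaySeconds = renderAheadSeconds + outputLatencySeconds;
    scheduleOnUiThread (delaySeconds, [controller, id, value] ()
    {
        controller->setParamNormalized (id, value); // UI thread, at "audible" time
    });
}
```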

Just to clarify, because I'm still seeing inconsistent behavior across plugins: this means that when automating (scheduling parameter changes in the audio thread), the host should call both IEditController::setParamNormalized AND schedule parameter changes via Vst::IParameterChanges.

Plus, when recording automation via a generic editor, i.e. from the UI thread, it should also do both. (So is it assumed safe to call IEditController::setParamNormalized in the audio thread?)

It seems to me that to get all plugins to work, both are needed in both the RT automation playback and the UI automation-writing cases, but then it begs the question: what's the point of having 2 separate mechanisms?

Let me explain the different cases again:

Playing automation already recorded in the DAW:

  • In the Audio Realtime Thread the DAW collects the automation parameter value changes for a specific time range and sends them in the process call together with the corresponding audio block (see the sketch after this list).

  • In the UI Thread the DAW sends to the controller the parameter values corresponding to the time position where the playback cursor is visualized (this should be in sync with what the user hears at the speakers).
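
A minimal sketch of the Audio Realtime Thread side, assuming the SDK's host-side ParameterChanges helper (or any other IParameterChanges implementation); the single hard-coded point stands in for the host's automation lookup:

```cpp
#include "pluginterfaces/vst/ivstaudioprocessor.h"
#include "public.sdk/source/vst/hosting/parameterchanges.h"

// Audio RT thread: hand the automation points for this block to the
// processor via IParameterChanges, together with the audio buffers in `data`.
void processBlockWithAutomation (Steinberg::Vst::IAudioProcessor* processor,
                                 Steinberg::Vst::ProcessData& data)
{
    Steinberg::Vst::ParameterChanges inputChanges;

    Steinberg::Vst::ParamID paramId = 0; // hypothetical automated parameter
    Steinberg::int32 queueIndex = 0;
    if (auto* queue = inputChanges.addParameterData (paramId, queueIndex))
    {
        Steinberg::int32 pointIndex = 0;
        queue->addPoint (0 /*sample offset in block*/, 0.5 /*normalized value*/, pointIndex);
    }

    data.inputParameterChanges = &inputChanges;
    processor->process (data);
}
```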

Recording Automation from UI Changes (using the mouse on the plugin's UI or a hardware remote control):

  • the changes should come from the UI Thread via beginEdit, performEdit…endEdit; the DAW can record these changes in its automation track. There is no need to call setParamNormalized back here when the change comes from the plugin itself.
  • [when the changes are coming from a generic UI controlled by the DAW, the mouse change will call setParamNormalized in the UI Thread]
  • these changes are scheduled to be sent as soon as possible to the processor, using the next available process call (in the Audio RT Thread); see the sketch below.
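
A sketch of that last step, again assuming the SDK's ParameterChanges host helper and a hypothetical queued-edit type like the one in the earlier sketch:

```cpp
#include <vector>
#include "public.sdk/source/vst/hosting/parameterchanges.h"

// Hypothetical representation of one queued UI edit.
struct PendingParamChange
{
    Steinberg::Vst::ParamID id;
    Steinberg::Vst::ParamValue value;
};

// Audio RT thread: move UI edits (already popped from the UI->audio FIFO)
// into the IParameterChanges object that will be passed to
// IAudioProcessor::process for this block.
void drainUiEditsInto (const std::vector<PendingParamChange>& uiEdits,
                       Steinberg::Vst::ParameterChanges& inputChanges)
{
    for (const auto& change : uiEdits)
    {
        Steinberg::int32 queueIndex = 0;
        if (auto* queue = inputChanges.addParameterData (change.id, queueIndex))
        {
            Steinberg::int32 pointIndex = 0;
            // Offset 0: apply at the start of the block; a more careful host
            // could spread offsets across the block.
            queue->addPoint (0, change.value, pointIndex);
        }
    }
}
```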

It is important to understand the 2 parts of the plugin: the processor (for processing) and the controller (for UI).
The DAW could, for example, process a plugin 1 second in advance, but the UI (controller) should show/get what is currently audible at the speakers: what you see/get is what you hear.