For ages I’ve been trying to find a good work-around for using CC from a MIDI-track to control VST-insert parameters.
This would, for me, have loads of creative uses, especially making generative music with MIDI-LFOs, Sequencers and Arpeggiators in conjunction with MIDI-transformers to make MIDI-Notes into CC.
For example, I would like to control the Time-parameter on a Delay-VST using the Auto LFO-MIDI insert.
The most common way is to route a MIDI track with Auto LFO out to a virtual port (loopMIDI etc.) and then back in to the Quick Controls. At that point you may as well be modulated externally, or by more capable plugins like MIDIShaper, or by ones that wrap other VSTs and modulate them internally.
So yes, there are plenty of ways to do that, but they become quite involved, and with the Quick Control route you’re limited to focusing on one track at a time.
You’d also have to make sure the LFO tool stops sending when other tracks are selected, or you’d be modulating the wrong Quick Control parameters.
i.e. the basic setup would be:-
MIDI TRACK > Auto LFO Insert > Virtual Port
Virtual Port > Generic Remote/Quick Control mapping
Would give you something like this:-
And then you’d use that MIDI Track to control the modulating parameters, and write the automation when you’re happy with the results.
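The Auto LFO leg of that chain is, at heart, just a CC generator running on a clock. As a rough illustration (not Cubase’s actual implementation — the function name and parameters here are invented), this is how a sine LFO maps onto the 7-bit CC values that would go out the virtual port:

```python
import math

def lfo_cc_values(rate_hz, sample_rate_hz, n_steps, depth=1.0):
    """Generate 7-bit CC values (0-127) for a sine LFO.

    rate_hz: LFO frequency; sample_rate_hz: how often a CC message
    is emitted. Centered on 64, scaled by depth, clamped to the
    legal MIDI CC range.
    """
    values = []
    for i in range(n_steps):
        phase = 2.0 * math.pi * rate_hz * i / sample_rate_hz
        v = 64 + depth * 63.5 * math.sin(phase)
        values.append(max(0, min(127, round(v))))
    return values

# One full cycle at 1 Hz, emitting 16 CC messages per second:
cycle = lfo_cc_values(rate_hz=1.0, sample_rate_hz=16.0, n_steps=16)
# In the real setup each value would be sent as a control_change
# message on the virtual port, which the Generic Remote / Quick
# Control mapping then picks up on the way back in.
```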
The other method would be to create Generic Remote mappings that hard-link to VST parameters in specific mixer slots. The trouble with that is that Generic Remotes cannot be loaded per project, so they’re very hard to manage.
If you want full step-by-step instructions and you’re on Windows, I can happily show you.
It’s insane that the core of Cubase is still entirely MIDI CC driven. There’s no form of VST parameter modulation, and you can’t even draw LFO shapes into the automation lanes, which would be a basic improvement.
The thing is, if you use the same instruments all the time, or for a whole album or something, and your controller has paging (and of course the Generic Remote can have paging as well), then Generic Remote mapping not being per project isn’t such a bad thing.
The Novation Launch Control XL does not have motorized faders, but it has 8 faders, 24 knobs, and 16 buttons, with 8 separate programmable pages in addition to a separate port for HUI. It’s light, cheap, and portable. It has buttons for paging the device, and separate buttons for paging the Generic Remote. That is more than enough for an album’s worth of control. It’s not modulation, but if you stick Reaktor in the mix, that can be done through Reaktor. Interface-wise it isn’t that different from what you get in Bitwig. And setup-wise, well, that’s just the hardware side you would have to manage anyway.
Steinberg making the Generic Remote settings easier to set up would be more valuable… but adding modulation would get rid of the need for Reaktor.
I don’t know. I’m going to make music today and nerd out again next week.
Reaktor has MIDI out, and you can create MIDI modulation within Reaktor. You can also write scripts in Kontakt, make GUIs to control the script, and link that to your hardware CC. A large portion of what NI has to offer started out, in one way or another, as Reaktor or Kontakt. The interfaces are really dated and the scripting language is horrible, but it gets the job done. Not as slick as what Bitwig has, but it works.
I have done the Kontakt script solution, and I have Reaktor, but I haven’t tried exactly that in Reaktor yet; I have too many little projects…
Obviously something just integrated would be optimum.
We’d also likely need some sort of VST universal control protocol, maybe in VST4, where part of the standard becomes that all controls in a plugin are accessible to some sort of universal control matrix that is part of the standard SDK.
I think if one VST3 plugin could see the parameters of the next VST3 after it, it would really open up some third-party modulation possibilities too, across all DAWs, and would push the VST standard for sound design.
Right!? It doesn’t even need to be standard in VST+ (VST4 … we don’t talk about VST4). Bitwig somehow allows for any control in any plugin to be “learned”, and it doesn’t require a MIDI track, just a dedicated port for that operation. So it doesn’t have to be in the VST API. I don’t have Bitwig, so I am trusting the demos I have watched. There may be some technicality I don’t understand. In addition to that, since they have everything able to be modulated, any LFO can be used to control any parameter (sort of like how you do it in Pigments).
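To make the idea above concrete, here’s a purely hypothetical sketch of what such a host-side modulation matrix could look like. The `Plugin` and `ModMatrix` names are invented for illustration and are not part of any VST SDK; the point is just that the host routes a source to a named parameter on a downstream plugin, Bitwig-style:

```python
import math

class Plugin:
    """Stand-in for a plugin exposing named parameters (hypothetical)."""
    def __init__(self, name, params):
        self.name = name
        self.params = dict(params)  # param name -> normalized 0..1 value

    def set_param(self, name, value):
        # Hosts normalize VST parameters to 0..1, so clamp to that range.
        self.params[name] = max(0.0, min(1.0, value))

class ModMatrix:
    """Routes modulation sources to parameters on downstream plugins."""
    def __init__(self):
        self.routes = []  # (source_fn, plugin, param, depth)

    def connect(self, source_fn, plugin, param, depth=1.0):
        self.routes.append((source_fn, plugin, param, depth))

    def tick(self, t):
        for source_fn, plugin, param, depth in self.routes:
            # Source returns -1..1; map it to 0..1 around the centre.
            plugin.set_param(param, 0.5 + 0.5 * depth * source_fn(t))

delay = Plugin("Delay", {"time": 0.5, "feedback": 0.3})
matrix = ModMatrix()
matrix.connect(lambda t: math.sin(2 * math.pi * t), delay, "time")
matrix.tick(0.25)  # sine peak -> "time" driven to 1.0
```

None of this needs the plugin’s cooperation; the host already owns the parameter list, which is presumably how Bitwig pulls it off.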
Imagine if:
- automation were treated like MIDI data, or just IS MIDI 2.0 data (DP?)
- every parameter was learnable whether it was coded that way or not (Bitwig?)
- automation were just the same as the rest of the MIDI data and could be routed and comped
- modulation could be routed through to a separate track to control a parameter, like in Pigments (Bitwig?)
- every time MIDI went into a track it also came out of the track, whether the VSTi altered it or not, as if it were a separate port (like what happens with Kontakt), so you didn’t have to use the 4 limited “MIDI sends”
I paid for that and tried it for audio between apps. It uses the system’s IP stack, so it has some performance issues. I haven’t tried using it as a virtual MIDI port within the DAW; that might work better.
What would be really cool is a VSTi that accepted audio through its side-chain and provided MIDI out. You could use any audio signal as if it were CV and then use that to modulate MIDI. I’m surprised I haven’t seen that yet. I guess you could create that in Reaktor… actually…
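That audio-to-MIDI idea is basically an envelope follower feeding a CC stream. A minimal sketch, assuming a simple one-pole follower (the function name and coefficients are made up for illustration):

```python
def envelope_to_cc(samples, attack=0.2, release=0.05):
    """Follow the amplitude of an audio buffer and emit 7-bit CC values.

    One-pole follower: the envelope rises at `attack` speed when the
    input gets louder and falls at `release` speed when it gets quieter,
    then is scaled to the 0-127 CC range.
    """
    env = 0.0
    cc = []
    for s in samples:
        level = abs(s)  # rectify: treat the audio like a control signal
        coeff = attack if level > env else release
        env += coeff * (level - env)
        cc.append(min(127, round(env * 127)))
    return cc

# A burst of full-scale audio followed by silence: CC rises, then decays.
values = envelope_to_cc([1.0] * 20 + [0.0] * 20)
```

In practice you’d decimate the output to a sane CC rate and add smoothing, but that’s the whole trick; it would indeed be a small patch to build in Reaktor.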