Using CC from a MIDI track to control any parameter (for example using Quick Controls)

For ages I’ve been trying to find a good workaround for using CC from a MIDI track to control VST insert parameters.
This would, for me, have loads of creative uses, especially making generative music with MIDI LFOs, sequencers, and arpeggiators in conjunction with MIDI transformers to turn MIDI notes into CC.

For example, I would like to control the Time parameter on a delay VST using the Auto LFO MIDI insert.
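For context, an Auto-LFO-style insert is essentially just an oscillator quantised to 7-bit CC values. A minimal sketch, where the depth, centre, and rates are arbitrary illustrations (none of this is the actual plugin):

```python
import math

def lfo_cc_values(rate_hz, sample_rate_hz, duration_s, cc_depth=64, cc_center=64):
    """Generate 7-bit CC values from a sine LFO, the way an
    Auto-LFO-style MIDI insert would (rates/depth are illustrative)."""
    values = []
    n = int(duration_s * sample_rate_hz)
    for i in range(n):
        t = i / sample_rate_hz
        v = cc_center + cc_depth * math.sin(2 * math.pi * rate_hz * t)
        values.append(max(0, min(127, int(round(v)))))  # clip to 0..127
    return values

# One cycle of a 1 Hz LFO sampled 8 times per second:
print(lfo_cc_values(1.0, 8, 1.0))  # -> [64, 109, 127, 109, 64, 19, 0, 19]
```

The idea in the thread is to get that stream of values onto a plugin parameter rather than a MIDI-channel CC.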

Is there any way I can do this?


Yeah, it’s pretty ridiculous right now…

The most common way is to route a MIDI track with Auto LFO out to a virtual port (loopMIDI etc.) and then back in to the Quick Controls via that virtual port. At which point you may as well just modulate externally, or use more capable plugins like MIDIShaper, or those that wrap other VSTs and modulate internally.

So yes, there are plenty of ways to do it, but they become quite involved, and you’re limited to focusing on one track at a time using the Quick Control route.

You’d also have to make sure the LFO tool stops sending when other tracks are selected or you’d be modulating the wrong quick control parameters.

i.e. the basic setup would be:-
MIDI TRACK > Auto LFO Insert > Virtual Port
Virtual Port > Generic Remote/Quick Control mapping

Would give you something like this:-
(screenshot: CBAuto)

And then you’d use that MIDI Track to control the modulating parameters, and write the automation when you’re happy with the results.
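The Virtual Port > Quick Control step is essentially a small dispatch table from CC numbers to Quick Control slots on the focused track. A sketch, assuming CC 70–77 map to slots 1–8 (invented numbers, not Cubase defaults):

```python
# Hypothetical Quick Control dispatcher: incoming CC on the virtual
# port is forwarded to the focused track's Quick Control slot.
# The CC-to-slot mapping is an assumption for illustration.
QC_CC_MAP = {70 + i: i + 1 for i in range(8)}  # CC 70-77 -> QC slots 1-8

def dispatch_cc(cc_number, cc_value, focused_track):
    """Return (track, slot, normalized value) or None if unmapped."""
    slot = QC_CC_MAP.get(cc_number)
    if slot is None:
        return None
    return (focused_track, slot, cc_value / 127.0)

print(dispatch_cc(70, 127, "Delay Track"))  # -> ('Delay Track', 1, 1.0)
```

Note that the result depends on which track is focused, which is exactly the caveat above about modulating the wrong Quick Control parameters when the selection changes.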

The other method would be to create Generic Remote mappings that hard-link to VST parameters in specific mixer slots. The trouble with that is that Generic Remotes cannot be loaded per project, so they’re very hard to manage.

If you want full step-by-step instructions and you’re on Windows, I can happily show you.

It’s insane that the core of Cubase is still entirely MIDI CC driven. There’s no form of VST parameter modulation, and you can’t even draw LFO shapes into the automation lanes, which would be a basic improvement.
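Drawing LFO shapes into an automation lane amounts to rendering a periodic function as (time, value) breakpoints. A minimal sketch, assuming normalized 0..1 values and a few illustrative shapes (this is not any Cubase API):

```python
import math

# A few classic LFO shapes as functions of phase p in 0..1:
SHAPES = {
    "sine": lambda p: 0.5 + 0.5 * math.sin(2 * math.pi * p),
    "triangle": lambda p: 1 - abs(2 * p - 1),
    "saw": lambda p: p,
}

def lfo_automation_points(shape, cycles=1, points_per_cycle=4):
    """Render an LFO shape as (time, value) automation breakpoints,
    the kind of thing you'd want to draw straight into a lane."""
    fn = SHAPES[shape]
    total = cycles * points_per_cycle
    return [(i / points_per_cycle, fn((i % points_per_cycle) / points_per_cycle))
            for i in range(total + 1)]

print(lfo_automation_points("triangle"))
# -> [(0.0, 0.0), (0.25, 0.5), (0.5, 1.0), (0.75, 0.5), (1.0, 0.0)]
```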


Yes, it would be great to see some kind of universal control matrix: a “Control Editor” window.

I imagine this to be something like Reaktor, where almost every control in Cubase can be tapped into.

For example,
right click on a fader > ‘Add to Control Matrix Editor’

That fader will now:
a.) Show as a control asset in a list of all assets that can be tapped into/out of.
b.) Appear as a block in the Control Matrix Editor.

The Control Matrix Editor would also be able to have its own modular blocks that exist on their own within the Editor.
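A toy model of such a Control Matrix Editor: sources and targets registered by name, with depth-scaled connections between them. All the names here are invented for illustration:

```python
class ControlMatrix:
    """Toy modulation matrix: registered controls can be patched
    to each other with a depth, like the hypothetical editor above."""
    def __init__(self):
        self.sources = {}       # name -> callable returning 0..1
        self.connections = []   # (source, target, depth)
        self.targets = {}       # name -> latest value

    def add_source(self, name, fn):
        self.sources[name] = fn

    def connect(self, source, target, depth=1.0):
        self.connections.append((source, target, depth))

    def tick(self):
        """Evaluate every patched connection once."""
        for src, tgt, depth in self.connections:
            self.targets[tgt] = self.sources[src]() * depth
        return dict(self.targets)

matrix = ControlMatrix()
matrix.add_source("Fader 1", lambda: 0.5)          # the right-clicked fader
matrix.connect("Fader 1", "Delay Time", depth=0.8)  # patch it to a parameter
print(matrix.tick())  # -> {'Delay Time': 0.4}
```

The modular blocks that exist on their own within the editor would just be extra sources (LFOs, envelopes) registered the same way.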

I guess that answers my question.

The thing is, if you use the same instruments all the time, or at least for a whole album, and both your controller and the Generic Remote support paging, then the mapping not being per-project isn’t such a bad thing.

You can use Reaktor in the same way, and you can route that internally.

Yeah, that’s right. Or you just save Generic Remote exports in the song’s project folder and import them per song as required.

…Or Steinberg could give us parameter modulation, of course! lol 🙂


Not sure what you mean

The Novation Launch Control XL does not have motorized faders, but it has 8 faders, 24 knobs, and 16 buttons, with 8 separate programmable pages in addition to a separate port for HUI. It’s light, cheap, and portable. It has buttons for paging the device, and separate buttons for paging the Generic Remote. That is more than enough for an album’s worth of control. It’s not modulation, but if you stick Reaktor in the mix, that can be done through Reaktor. Interface-wise it isn’t that different from what you get in Bitwig. And setup-wise, well, that’s just the hardware side you would have to manage anyway.

Steinberg making the Generic Remote settings easier to work with would be more valuable… but adding modulation would get rid of the need for Reaktor.

I don’t know. I’m going to make music today and nerd out again next week.


Reaktor has MIDI out, and you can create MIDI modulation within Reaktor. You can also write scripts in Kontakt, make GUIs to control the script, and link that to your hardware CCs. A large portion of what NI has to offer started out, in one way or another, as Reaktor or Kontakt. The interfaces are really dated and the scripting language is horrible, but it gets the job done. Not as slick as what Bitwig has, but it works.

I have done the Kontakt script solution, and I have Reaktor, but I haven’t tried exactly that in Reaktor yet; I have too many little projects…

ah yeah, I see.

Obviously something just integrated would be optimum.

We’d also likely need some sort of VST universal control protocol, maybe in VST4, where part of the standard becomes that all controls in a plugin are accessible to some sort of universal control matrix built into the standard SDK.

I think if one VST3 plugin could see the parameters of the next VST3 after it, that would really open up some third-party modulation possibilities too, across all DAWs, and would push the VST standard for sound design.

Sounds crazy to implement though. lol


Right!? It doesn’t even need to be standard in VST+ (VST4 … we don’t talk about VST4). Bitwig somehow allows for any control in any plugin to be “learned”, and it doesn’t require a MIDI track, just a dedicated port for that operation. So it doesn’t have to be in the VST API. I don’t have Bitwig, so I am trusting the demos I have watched. There may be some technicality I don’t understand. In addition to that, since they have everything able to be modulated, any LFO can be used to control any parameter (sort of like how you do it in Pigments).

So if

  1. automation were treated like MIDI data, or just IS MIDI 2.0 data (DP?)
  2. every parameter were learnable whether it was coded that way or not (Bitwig?)
  3. automation were just the same as the rest of the MIDI data and could be routed and comped.
  4. modulation could be routed through a separate track to control a parameter, like in Pigments. (Bitwig?)
  5. every time MIDI went into a track it also came out of the track, whether the VSTi altered it or not, as if it were a separate port (like what happens with Kontakt), so you didn’t have to use the 4 limited “MIDI sends”

We would all have everything we want. I think.
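Item 1 on that list is less far-fetched than it sounds: MIDI 2.0 does use 32-bit controller resolution, so automation breakpoints map onto it with plenty of precision. The conversion below is purely illustrative (the scaling is an assumption, not any DAW’s format):

```python
def automation_to_midi2_cc(points):
    """Sketch of treating automation as MIDI 2.0 data: map normalized
    (time, value) breakpoints to 32-bit controller values."""
    MAX32 = 0xFFFFFFFF  # MIDI 2.0 controllers are 32-bit
    return [(t, int(round(v * MAX32))) for t, v in points]

pts = automation_to_midi2_cc([(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)])
print(pts)  # -> [(0.0, 0), (1.0, 2147483648), (2.0, 4294967295)]
```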

I was just about to say that; you beat me to it while I was typing, lol.


Perhaps Blue Cat Audio’s Connector would make it simpler? I’ve yet to try it, but it looks promising.

I paid for that and tried it for audio between apps. It uses the system’s IP stack, so it has some performance issues. I haven’t tried using it as a virtual MIDI port within the DAW. That might work better.

What would be really cool is a VSTi that accepted audio through a side-chain and provided MIDI out. You could use any audio signal as if it were CV and then use that to modulate MIDI. I’m surprised I haven’t seen that yet. I guess you could create that in Reaktor… actually…
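That audio-to-CC idea is basically an envelope follower in front of a MIDI output. A minimal sketch, where the window size and the peak-detection choice are arbitrary:

```python
def audio_to_cc(samples, window=64):
    """Envelope-follower sketch: convert an audio buffer (floats in
    -1..1) into one 7-bit CC value per window of samples."""
    cc_out = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        peak = max(abs(s) for s in chunk)          # simple peak detector
        cc_out.append(min(127, int(peak * 127)))   # scale to 0..127
    return cc_out

# A fading-in ramp yields rising CC values:
ramp = [i / 256 for i in range(256)]
print(audio_to_cc(ramp))  # -> [31, 63, 94, 126]
```

A real version would smooth the envelope (attack/release) before quantising, or you’d hear the CC steps.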

Dang it! That’s another project to try.


You should try Bitwig. It can do all of what’s been discussed here - and much more! - without any workarounds, natively:

https://www.bitwig.com/learnings/an-introduction-to-modulators-45/

Yeah, except I really keep meaning to get serious, stop doing every part in the song in one take, do multiple takes, and comp them like a professional, and you can’t do that with audio in Bitwig.

Bitwig 4.0 introduced audio comping.

shhhhhh!

we want these features in Cubase, right?
