Use a MIDI controller to control plug-ins like FabFilter

Hi folks,
I am having a heck of a time trying to figure this out. I have a Kontrol-25 keyboard with 8 V-pots. My goal is to open a VST plug-in (like FabFilter Pro-C 2, for example) and then assign threshold, ratio, attack, and release to the first four knobs.

I want to be able to open any instance of that plugin and when the focus is on that plugin be able to control those four parameters with the pots.

I’d like to then be able to do the same thing with my other plugins so they are all pre-mapped and ready to be controlled when I open them.

My understanding is that this is something that can be done in Cubase, correct?

OK, so if it can indeed be done, can anyone please help me with that? I have figured out how to create a generic device, and also figured out how to assign the V-pots to control both Track Quick Controls and VST instrument Quick Controls.

What I don’t understand is how to assign the V-pots to control pots 1-8 in the “Remote Control Editor” for the VST plugins themselves. How do I do that? To be clear, I do NOT want to use Quick Controls to do this. Thanks!


Use the Focus Quick Control concept, please.

Is that limited to 8 controls? Or can I map more than 8 pots and faders using this method?

If it is 8, how do we get around this limitation so we can control plug-ins properly?

The FQCs (Focus Quick Controls) give you 8 controls for whatever is in focus.
The TQCs (Track Quick Controls) give you 8 controls for the track that is in focus.

If you set the QCs on the plugin using the QC icon at the top of the plugin window, then whenever you open that plugin the mapping will be the same. These 8 are generally enough in most cases. You can even temporarily lock these controllers to a given plugin instance.

When they are not, you can use the Track Quick Controls for 8 more. But those 8 apply to the whole track.

You can also make assignments to additional controllers either globally, or on a project by project basis. Unfortunately, those will be bound to the instance.

Or, you can route the MIDI through a MIDI track. Until recently this has been my go-to way of doing anything MIDI in Cubase. Basically you are back to concerning yourself with the MIDI at the hardware level in every case. You can route the MIDI from a MIDI track to the track with the plugin on it. And do a midi learn to set the controller to the desired knob etc. on the plugin.
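Whatever a MIDI Learn assignment ends up bound to, what actually travels down that MIDI track is a plain three-byte Control Change message. As a minimal, Cubase-independent sketch of what the plugin receives when you turn a learned knob (the channel/CC values below are arbitrary examples):

```python
def make_cc(channel: int, controller: int, value: int) -> bytes:
    """Build a raw 3-byte MIDI Control Change message.

    channel: 0-15, controller: 0-127, value: 0-127.
    """
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("MIDI CC fields out of range")
    # Status byte: 0xB0 means Control Change; the low nibble carries the channel.
    return bytes([0xB0 | channel, controller, value])

# Example: CC 21 at half travel on channel 1 (channels are 0-indexed in the bytes).
msg = make_cc(0, 21, 64)
```

This is why MIDI Learn has to be coded by the plugin developer: the plugin only ever sees anonymous CC numbers like these and must maintain its own CC-to-parameter table.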

Unfortunately Quick Controls and MIDI tracks are currently incompatible.

Since the MIDI Remote arrived in Cubase 12, I have found that for instruments I get 16 controls, which is far more than I ever need.

On audio tracks and plugins I don’t find that I need more than 8, and I have plenty of track controls free, because many of the other features I was mapping to Track Quick Controls now have their own dedicated knob.

The remaining issues are the Channel Strip settings other than Pre and EQ, but it does seem like those will be added soon.

It would be amazing if we could get 24 FQCs rather than just 8, and it would also be great if we could get QCs to work on MIDI tracks.

In addition it would be great if we could map controllers in the Midi Remote to specific CCs or Pitch Bend etc, so we could continue to use MIDI tracks effectively.

So those are the abilities and limitations as best I can describe them.


It would be amazing if we could get 24 FQCs rather than just 8, and it would also be great if we could get QCs to work on MIDI tracks.

Or if we could get FQC pages, so we could control more than 8 parameters in banks of 8.
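Cubase does not offer FQC pages today, but the banking idea is just a thin switching layer: 8 physical pots, several pages, each page mapping a pot index to a parameter. A minimal sketch of the concept (all parameter names here are hypothetical):

```python
class BankedPots:
    """8 physical pots switched across pages of parameter names."""
    POTS = 8

    def __init__(self, pages):
        self.pages = pages      # list of pages, each a list of up to 8 names
        self.current = 0        # index of the active page

    def next_page(self):
        """Cycle to the next bank of 8, wrapping around at the end."""
        self.current = (self.current + 1) % len(self.pages)

    def target(self, pot_index):
        """Which parameter the given pot (0-7) controls on the active page."""
        page = self.pages[self.current]
        return page[pot_index] if pot_index < len(page) else None

# Two hypothetical EQ pages: bands 1-2, then bands 3-4.
pots = BankedPots([
    ["b1_freq", "b1_q", "b1_gain", "b2_freq", "b2_q", "b2_gain"],
    ["b3_freq", "b3_q", "b3_gain", "b4_freq", "b4_q", "b4_gain"],
])
```

One page-switch button would then let the same 8 pots cover 16, 24, or more parameters, which is exactly what the FQC-pages request above is asking for.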


That’s a very detailed description, thank you.

I want to set up my MIDI controller to control FabFilter Pro-Q 3, as this is a plug-in I use extensively.

I want to set up a default patch with 6 bands activated and then map 18 pots on my MIDI controller to control three parameters per band (frequency, Q width, gain).

The aim being that whenever I load up an instance of Pro-Q 3, it opens with the default patch and I can IMMEDIATELY hop onto my MIDI controller and move pots with both hands to control the 6 EQ bands that are set up (much closer to a hardware experience).

If there were 18+ FQCs available, this would definitely be possible. Are you able to let me know how I can do this with the 8 FQCs, plus other methods to get over the limitation?

Unfortunately no. For similar reasons I have not treated tracks as channels, but instead have a whole complex workflow that involves track archives rather than track presets, and includes a special CC MIDI track for each channel. I then load a channel archive, which gives me the pre-mapped MIDI track, and when I select that channel, a set of very complex Project Logical Editor presets makes that MIDI track monitored, so the pre-defined CCs go to the right place.

Basically it’s building my own 8-track recorder on top of Cubase to get around workflow-inhibiting limitations, including not having the ability to control more than 8 Quick Controls.

I want to stay focused on what I am doing, not how I am doing it, so I completely separate the setup from the music making.

Basically you want what I have made for the built-in EQ in the channel strip, with the 4 rows on the left.


There are requests right now to the MR designers to be able to script the same sort of thing for any plugin by using the name of the plugin.

This seems really intuitive and easy until you think about how compression is often used by metal engineers, who use multiple passes of EQ and compression. Personally I think it’s silly, and if I were going to do that, I would do it in multiple FX tracks rather than multiple instances of the same chain on the same track. Nevertheless, the request to be able to map controllers directly to a given plugin type on the selected track is held up by concern for this edge case. And it would still require scripting to make it work.

Still, this may become a reality soon, I hope.

At the moment you are left to create a template for what you want, with an audio track and a MIDI track together.

Thanks so much again for getting back to me. I am the same as you in that I don’t want to be distracted when I am actually doing music, I don’t mind a bit of set up time before.

Are you able to point me in the direction of some sort of tutorial that explains the complex Project Logical Editor sequences you are using, which automatically monitor the MIDI track associated with the track you are working on, so that the MIDI controller is armed and ready to go?

I wouldn’t mind exploring this myself to set up for one or two plug-ins.

Also, who are the MR designers? That sounds interesting. It seems like a bit of an oversight to have had all this FQC functionality ready and incorporated into Cubase since 2014 or earlier and then limit it to only eight pots!

Could you save track presets with the Pro-Q 3 insert and Quick Controls set as desired? You could add a MIDI Remote page dedicated to that, and another page with common commands.
I didn’t test it, just thinking…



That’s the easiest idea. But you only get 16 total possible controls that are “class level” rather than “instance level”.

To get more than that you have to go through MIDI. You can make presets for that too, but it starts to get complicated really quickly. Not only that, but the plugin developer has to make MIDI Learn available; they have to specifically code for it.


I have been doing some testing this morning, and since the MIDI Remote in Cubase 12, it seems that some plugins which used to work with MIDI Learn no longer do. So that’s even more of a bummer.

The instruments and amp sims all work, which is most of what I use that way, but most of the effects for mixing do not!!!

Nevertheless, what you do is:

  1. Make a folder
  2. Make an Audio track and put it in the folder
  3. Put your plugin on that track
  4. Make a MIDI track and put it in the folder
  5. Route the output of the MIDI track to the plugin on the audio track
  6. Monitor enable the MIDI track
  7. Open the plugin, right-click, and select MIDI Learn or Enable MIDI Learn
  8. Move the knob or whatever
  9. Repeat for all of the controls
  10. Select the folder and the tracks
  11. Select File->Export->Track Archive and give it a name.

Now you can import the track archive and the mappings will still be valid.


  • This doesn’t work with track presets. I don’t know why, I just never got it to work.
  • If you duplicate the tracks it loses the mapping. Not just in the duplicate, but also in the tracks you duplicated! I don’t know why.
  • You can map the same knobs etc. to different plugins, but if the associated MIDI tracks are both monitor-enabled they will both respond. To get around this you need Project Logical Editor settings to arm the MIDI track when you arm the audio track, and that is where it starts to get complicated really fast.

Being able to map the same CC to more than one plugin is really useful for guitar sims though. You can’t do that through the MR.

So this really only works for in-the-box recording, where you have a set number of instruments and are recording those specifically. It’s like using a foot controller as if it were a pedal board.

It is NOT a good solution for mixing!

But here is the link where I shared the setup. Most of what you would need is not the attempt at non-linear recording, but the PLEs, which are in the presets zip.

It’s too much for what you want though.

It’s not “ready for prime time” or anything, but it’s what I do when I am recording.

Amazing, thanks for your help. I will take a look at all this when I get a minute.

One thing I’m curious about is how something like Softube Console 1 has instant control over its plug-ins and some UAD plug-ins with more than 8 pots…

I’m not in the UAD ecosystem in the slightest, and neither are the people I interact with. That is more of a mixing community than a producer, musician, or composer sort of ecosystem. (Go ahead and chime in if you disagree, and I know there are plenty who will, but this is my bubble, and the point is to point out my bubble, not to make some blanket statement that is easily proven short-sighted.)

Softube and NI, on the other hand, I don’t have system-level insight into. Even the insight I have with Cubase is a guess based on over 20 years of building real-time systems. It’s just a guess, and it’s often wrong.

That said, I think it has to do with the driver built into the software and the hardware. The hardware is plugged in and says “look at me”. The software sees that and says, “oh yeah, we can talk to each other, so here is a message you might be interested in.” And the hardware just blindly responds to that message as a valid message that it should be getting. Just like when you write an MR script, it has to provide a filter for which device it expects to talk to, like startsWith(“name of the device”). If it has the right name, it just starts to talk to it and do what the script says to do.

In the case of MRs, it is built into Cubase to let you have a script like that. You could do the same thing and write a script in Python or Java or C++ or Lua or whatever, and it would just check the MIDI ports, find what it’s looking for, and start talking. But it won’t have any way to then talk to the DAW, because the DAW hasn’t provided a way to talk to it. (Actually, before MR there was Mackie etc., and you could write an app, run it on the machine, and have the same sort of thing happen through much more complicated means.)
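The detection step such a script performs is essentially a name-prefix match against the available MIDI ports, which is what the MR startsWith() filter mentioned above does. A minimal sketch of just the matching logic (real port enumeration would need a MIDI library such as python-rtmidi, and the port names below are made up):

```python
def find_device_ports(port_names, device_prefix):
    """Return the ports whose names start with the expected device prefix,
    mirroring an MR script's startsWith() detection filter."""
    return [name for name in port_names if name.startswith(device_prefix)]

# Hypothetical port list, roughly as an OS might report it:
ports = [
    "Komplete Kontrol A25 MIDI",
    "loopMIDI Port",
    "Komplete Kontrol A25 DAW",
]
matches = find_device_ports(ports, "Komplete Kontrol A25")
```

Once a match is found, the script and the hardware just start talking; the hard part, as described above, is that nothing outside the DAW’s own API can talk back to the DAW.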

With VSTs they have their own software, so they can provide the interface themselves and code their own software to talk to their own device.

The key is, they have their own API within their own software for that. What a protocol like Mackie Control or HUI does is provide an API that both the DAW and the script/hardware are capable of communicating with. It makes a generic language for both to communicate in. But those never really lived up to their promise of being generic. You really still need to know what the actual implementation of the DAW is, recognize which DAW you are connected to, and then respond in the appropriate way for that DAW, and that makes the whole thing much more complicated. Which is why the integration with a DAW always seems so limited. You get a CC121 and it works with Cubase really well, but a generic surface will integrate in a much less seamless way, unless the maker purpose-built that integration for that DAW, like Arturia or NI have done, or a Launchpad with Live or Logic.

Does that make sense? I might not be 100% correct here, but that should fill you in to the point of my meager understanding.

Well, if you read that book, then maybe you want to hear what I think about how MRs could handle this situation intelligently, without burdening the script writer too much, while still allowing an interface through the MR UI designer. Because that is what the book below is about.

Say you want to control FabFilter Pro-Q 3 with an MR. There is this concept of Object-Oriented User Interfaces. This is distinct from Object-Oriented Programming; in fact, the naming similarity has a history that is inverted from our modern expectation. (PM me for references; I don’t feel comfortable name-dropping here.)

If you think of the Pro-Q 3 as a Class, and each insert as an Instance, then it becomes a matter of Focus. (And I am hoping that is where things are going here, because of the Focus Quick Control name.)

If you select a track and it happens to have 1 Pro-Q 3 on it, then that is the plugin under Focus. If there is more than 1 insert instance of the Pro-Q 3 Class, then you have to have some way of making the 2nd or 3rd etc. the one “In Focus”: a next/previous control, or the user selects it with the mouse or arrow keys or whatever.

Whichever one is “In Focus” is the one being controlled.

In this way you could go to an instance of Pro-Q 3, right-click, and make assignments to whatever surface elements on a given page you want. Then it would be bound to the Class of Pro-Q 3, not the instance of Pro-Q 3 you used to make the assignment.

This would greatly simplify everything when making bindings through the UI, and would match the intuition of the user!

“I made this knob control the knob on Pro-Q 3, and now it controls that knob for all Pro-Q 3s, depending on which one I am looking at or focused on.”

It wouldn’t require any sort of list retrieval or filtering in the scripting. It would just be whichever instance of the given plugin was “In Focus”.

This could be achieved by providing more than 8 FQCs and it makes for a nice tidy way of defining directly which controls are mapped to what. But it requires indirection by the user!!!

The user has to think through which controls they are assigning to FQCs and how their layout maps to FQCs etc.


Why not map directly to the Surface already defined rather than some extra concept of Quick Controls?

This would completely alleviate the need to create extra MIDI tracks just to pass controls along to an instrument or plugin. Well, it would if one could also define expressions through the MR!

So, what about expressions?

That is where the idea of a Secondary Mapping would come in more handy than it does with Quick Controls: there would be something meaningful to map to, rather than just an enumerated list of controls.

So the user defines a control on a given page as controlling Vibrato. Done; now it’s up to the VST instrument to provide the appropriate mapping to the Vibrato Interface Class. And if it doesn’t, then the extra controls for a given Class of Expression can be defined by the user.

The user’s experience would be:

“I want this control on this page to control Vibrato”.
Then they open a VST and move the fader they assigned to “Vibrato” and it works.
If it doesn’t work, then they can select the UI element on the VST UI that says Vibrato and select “Assign to expression” and select “Vibrato”.
But you say, there is no UI for vibrato in the VSTi.
That’s fine: you select the Exp icon at the top of the UI window, right about where the F is now, find Vibrato, and assign it to the instrument’s control (say it’s Spitfire? OK, that’s CC21). Done.
Now your “Vibrato” assignment in the MR will send Spitfire CC21 when Spitfire is In Focus, or when Focus is locked.

The complaint here is: well, what about “Tightness”? How do we assign that? Spitfire doesn’t even have Tightness in every one of their instruments. Tightness is kind of unique, and it’s something different elsewhere, like maybe “Pizzicato Length” or something weird like that. Well, pick a name, or better yet,

Let the User define the Expression Class and then define one or more instances to it.

You only need to know that what you call “Pizzicato Length” is CC18 on Spitfire, not that Spitfire calls it “Tightness”. And maybe they want to map more than one expression as well. Fine, let them pile it up as much as they want.

The point is, they either pick from a pre-defined traditional set of expressions, or they define their own and map a control to that expression. Then map Expression Classes to one or more specific controls or CCs for a given instrument.
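A user-defined Expression Class registry like the one described could be as small as one lookup: (instrument, expression name) → CC number. The Spitfire CC numbers below are the ones mentioned in this post; the class and method names are hypothetical:

```python
class ExpressionMap:
    """Map user-named Expression Classes to per-instrument MIDI CC numbers."""

    def __init__(self):
        self.cc_for = {}   # (instrument, expression name) -> CC number

    def define(self, instrument, expression, cc):
        """Bind an expression name to a CC for one instrument."""
        self.cc_for[(instrument, expression)] = cc

    def resolve(self, instrument, expression):
        """CC to send when this expression moves while this instrument has focus.
        Returns None if the user never mapped it for this instrument."""
        return self.cc_for.get((instrument, expression))

exprs = ExpressionMap()
exprs.define("Spitfire", "Vibrato", 21)           # CC21, as in the post
exprs.define("Spitfire", "Pizzicato Length", 18)  # the user's name for "Tightness"
```

A single surface control bound to “Vibrato” would then call resolve() against whatever instrument is in focus, so the user thinks in expression names and only touches CC numbers once, at definition time.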

For amp sims you have “Wah” and “Overdrive” etc., either mapped automatically by the amp sim manufacturer to the available Expression, or created by the user as a mapping to a control already assigned to that Expression on their MIDI pedal.

With this there would be no need at all for those extra MIDI tracks all over the place, requiring so many extra PLEs to make sure the focus is correct. And it is really intuitive and easy for the user, who only has to occasionally think about what CC they are actually using when they do the initial mapping, and most of the time, they won’t even care about the CCs at all.
