Trigger Automation via MIDI

Hello folks

Crazy idea here.
Is it possible to somehow control an automation lane with aftertouch or note-off MIDI messages?

I would like to trigger RoomWorks’ Hold button from my MIDI keyboard.

Yes…

  1. Make sure you have an instance of RoomWorks loaded into a project.

  2. Make a new Generic Remote Device and set it to listen to whatever MIDI port your controller uses.

  3. Bind a MIDI event from a key/MPC pad/button/pedal/whatever directly to the plugin’s Hold button using the Generic Remote map. It’s also possible to bind to ‘relative DAW controls’ on the automation lanes themselves instead of directly to a plugin instance, but be aware that if you do that and later change the track order, it can get confusing; that’s why I suggest binding directly to the plugin. If you use more than one instance of RoomWorks, note that the map binds according to the order the instances were loaded into the project (instance 1, instance 2, etc.).

  4. Export a back-up copy of the new Generic Remote Device to a location of your choice. For some reason these maps don’t always stick unless you do this. No idea why, but if you expect the map to be there from now on…back it up.

If you also want to be able to drive the map through a MIDI track, set the track’s MIDI input to your controller, use a virtual MIDI port as the track’s output, and assign that virtual port in the Generic Remote map (instead of your controller’s actual MIDI port). If you’re on Windows and don’t have a virtual port, I recommend loopMIDI. If you’re on a Mac, you can create one with the IAC Driver in the OS’s Audio MIDI Setup utility.
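
(Side note: if you want to sanity-check which MIDI ports your system actually exposes before assigning them in the Generic Remote map, a few lines of Python with the mido library will list them. This is purely an optional sketch outside of Cubase; mido/python-rtmidi and the port name below are my own assumptions, not anything Cubase requires.)

```python
# Optional sketch: list the MIDI ports the OS exposes, so you can confirm the
# virtual port (e.g. a loopMIDI port on Windows, an IAC bus on macOS) is visible.
# Assumes the 'mido' and 'python-rtmidi' packages are installed.
import mido

print("Inputs: ", mido.get_input_names())
print("Outputs:", mido.get_output_names())

# On macOS/Linux the rtmidi backend can also create a virtual port on the fly
# (not available on Windows -- that's what loopMIDI is for):
# port = mido.open_output('My Virtual Port', virtual=True)
```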

(On using aftertouch to work a toggle-based control: you’ll probably need a MIDI transformer on a MIDI track for this, i.e. if aftertouch is greater than 64, send a note-on or CC event to the Remote Device map; if it’s less than 64, send a note-off or CC event, etc. See the next paragraph.)

Personally, I find it VERY useful to set up some virtual ports and route things through MIDI tracks. That way I can even record and edit the real-time movements of my controllers. I can put MIDI transformers (handy if you want to use aftertouch; there’s a rough sketch of that threshold logic below), monitors, and more in the track inspector’s insert slots, and re-route things to drive more than one Remote map if needed.

One of my favorite things about this method is that MIDI tracks give us all kinds of interesting real-time record-cycle and take-management options that we don’t get with straight-up automation lanes. For example, you could loop a few bars, leave the transport running, and take several recordings of your controller movements that automatically go to fresh tracks on each cycle, then sort through them later and use ‘the best one’. Plus, you can open the Key Editor and apply MIDI Logical Editor presets to the CC events stored there. You can even cut and paste the CC events directly into a VST automation lane in later stages of the project if you like (freeze it all into the VST automation lanes and get rid of the MIDI tracks).
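
To make the aftertouch-to-toggle idea a bit more concrete, here’s roughly what that threshold logic looks like if you sketch it outside Cubase in Python with the mido library (my choice for illustration only; inside Cubase you’d build the same thing with the Transformer insert mentioned above). The port names, the CC number, and the exact threshold handling are placeholders to adjust for your own setup.

```python
# Rough sketch of the 'aftertouch > 64 = on, < 64 = off' idea from above,
# done outside Cubase with the 'mido' library (python-rtmidi as backend).
# Port names and the CC number are placeholders -- adjust to your setup.
import mido

CONTROLLER_PORT = 'My Keyboard'    # the port your controller shows up on
VIRTUAL_PORT    = 'loopMIDI Port'  # the port the Generic Remote listens to
TOGGLE_CC       = 64               # CC the Remote map binds to the Hold button

with mido.open_input(CONTROLLER_PORT) as inp, mido.open_output(VIRTUAL_PORT) as out:
    held = False
    for msg in inp:                      # blocks and yields incoming messages
        if msg.type == 'aftertouch':     # channel pressure from the keyboard
            if msg.value > 64 and not held:
                out.send(mido.Message('control_change',
                                      control=TOGGLE_CC, value=127))
                held = True
            elif msg.value < 64 and held:
                out.send(mido.Message('control_change',
                                      control=TOGGLE_CC, value=0))
                held = False
```

The `held` flag just keeps the script from re-sending the same CC on every pressure message while you lean on the key.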

If the Generic Remote Device you’ve created ever gets in your way…you can just set its MIDI input to NONE to disable it until you need it again.

Finally, be aware that Generic Remote Devices don’t save with a project. They are a local part of your Cubase settings (on your computer only). So if you’ll be sharing the project across different computers, make a note of it and drop a copy of your Remote Device back-up in the project folder before moving it to a new system, so you can easily import it into Cubase on the target machine.


P.S.

Depending on your plugin it might also be rather quick and easy to use a Quick Control instead of a Generic Remote. It’s worth checking out.

I posted the Remote Device method first because that route should work with pretty much ANY plugin, and you can make it a more permanent thing that always works if you like (even if the track isn’t armed). In contrast, the Quick Control method can require that a plugin be designed a certain way to support the relevant VST protocols, plus the track has to be armed/selected/in the foreground and the controls properly snapped/activated, which isn’t always the case for every plugin and workflow.

Sometimes you want the control to work no matter what else you might have armed and pulled up to the front of your workflow at the moment, and a Generic Remote would be better in those cases.