Automating multiple synths with MIDI CC control/automation

“Are you trying to move stuff in more than one instrument/track at the same time?”
YES!
With a midi send channel.
Only synths with “Midi Learn” can do this; the Quick Controls and the Generic Remotes can’t.
I want to change all 3 synths’ cut-off at the same time with Pot 01 and record/play the automation in the midi track. I don’t want 3 rows of automation, just one that controls all 3 synths.

Like in the video

OK, and you say some of the synths list no VST Parameter (have no lane, no way to ‘learn’) for what you want to do, correct?

What plugins are these (In case I have them and can look around)?

Do you know which CC the ones that aren’t working expect to drive that control (if they accept one at all)?

SPIRE (by Reveal Sound) has midi learn (the one used in the video) and it works
NEXUS 3 has none
JP6K has none
SYLENTH1 has midi learn

With JP6K, CC74 drives the Layer volume, and nothing I move on my controller moves the Filter or Resonance
PS: The JP6K is THE BEST recreation of the Roland JP-8000 and it’s cheap

OK…study up on NEXUS and JP6K to see if they have a controller assigned to that thing you want to control, or a way to do it internally.

Spire and Sylenth…we should be able to get those two working.

Have them both learn the same CC and channel. Arm both tracks for monitor or record, make sure their track inputs are listening to your Controller…and when you move the pot/fader, both ‘should’ respond.
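If it helps to see it outside of Cubase: the pot is just emitting one stream of CC messages, and every plugin that has ‘learned’ that CC follows the same stream. Here’s a rough Python sketch of that stream; the mido library and the port name are my own assumptions for illustration, not anything Cubase needs:

```python
# A sweep of CC74 (the CC often used for filter cutoff).
# Anything listening that has learned CC74 on this channel should follow it.
import time
import mido

PORT_NAME = "loopMIDI Port"   # hypothetical: pick something from mido.get_output_names()

with mido.open_output(PORT_NAME) as out:
    for value in range(0, 128, 8):   # 0..127 in coarse steps
        # channel=0 is MIDI channel 1 in mido
        out.send(mido.Message("control_change", channel=0, control=74, value=value))
        time.sleep(0.05)
```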

I just grabbed the docs on the JP6K; it doesn’t say how to do automation at all. I was going to grab the demo, but it says ‘automation disabled in demo, silence every 60 seconds’, etc…

It mentions automation, but doesn’t say anything about how to do it! Right clicking a control and choosing ‘learn’ is pretty standard for DAWs, if that does not work, I’m at a loss.

I still can’t find the docs for NEXUS…searching.

The JP6K developer says there is no midi learn. He uses FL Studio, and FL Studio can make ANY synth midi learn (I’ve tried and it does).
I’m sticking with Cubase because of the sound quality, but this issue of not being able to operate my layered synths like one instrument is putting my whole mojo in the dump :frowning:

Thanks for the help. I need to go to bed :slight_smile:

Cubase should be able to do so as well.

If it’s registering a VST Parameter, there should be a way to get a lane on it. It didn’t show up in the “Remote Control” editor?

This means you could hard-bind a CC to that control with a Generic Map. Record the automation on the lane (R/W buttons again).

All the parameters show up in the “Remote Control Editor”. I CAN map any parameter to any of the 8 slots in the editor and then map the first slot to a button on my controller.

BUT

you can’t access that button from the midi send channel. I can’t connect CC74 (for instance) to the Filter. The midi send channel sends CC data, but I can only access a specific parameter via Quick Controls

OK, let’s try hard binding it to a Generic Remote.
Keeping it simple to begin with. A fresh new map.

I don’t have those synths, but here’s an example with Retrologue.

In the lower panel I found the “Device/Plugin” I want. In this case ‘retrologue’.

Over in the Value/Action field, I get a pop up that lists all Registered VST Parameters for the plugin, in this case I picked “Osc 1 Level”.

If the parameter you want to get at isn’t in that pop-up list for “Value/Action”, I’m at a loss! It’s not a properly registered VST parameter! That guy needs to properly register it and put a learn button in his plugin, or at the least set up some default automation CCs and list them in the docs!

In the Top Panel, Learn a control (I got it on Mod Wheel CC1 for this demo).

For MIDI Input, I got it set for the MPK2 port my pot (K1) is using (Goes through a Virtual port I call Bome in my case).

Now when I move my Mod Wheel, the control moves as it should in Retrologue.

At this point, anytime I send a mod wheel event, it’s going to move this control. Doesn’t matter if a track is armed or not. As long as this plugin is loaded in this instance order and the Generic Remote hears CC1 on Channel 1, it’s gonna pump.

So, how do I record these movements?

I find the lane for it in the project view, here I can record what that parameter is doing.

Oops, I got an extra copy of some useless CC1 events up there in my MIDI Track. I don’t need these, since in this case we’re doing a ‘cheat’ of sorts, we don’t have any choice but to keep the automation for this particular control in a VST automation lane. If the plugin had a learn feature, it’d be no problem…oh well. (I lied about no choice, but the alternatives get complicated, and involve using a Virtual port, some empty MIDI tracks to route things around…etc. Probably not worth it for this application!).

I could use a MIDI Transformer (Global/Track/Insert) to filter it out altogether at the input so it doesn’t get recorded in the MIDI/Instrument track in the first place, or just use a Logical Editor to remove all the CC1 events. Or set things up so the Generic Remote stuff is done over a different MIDI port…it wouldn’t get recorded in the MIDI track then either.
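The Logical Editor version of that cleanup is basically ‘delete every CC1 event, keep everything else’. If you ever want to do the same thing to an exported MIDI file outside Cubase, here’s a rough sketch in Python; the mido library and the file names are placeholders of mine, not part of the Cubase workflow:

```python
# Strip all CC1 (mod wheel) events from a MIDI file while keeping timing intact.
import mido

src = mido.MidiFile("take_with_cc1.mid")                 # hypothetical input file
dst = mido.MidiFile(ticks_per_beat=src.ticks_per_beat)

for track in src.tracks:
    new_track = mido.MidiTrack()
    carried = 0                                          # delta time owed by removed events
    for msg in track:
        if msg.type == "control_change" and msg.control == 1:
            carried += msg.time                          # drop CC1, but keep its delta time
            continue
        new_track.append(msg.copy(time=msg.time + carried))
        carried = 0
    dst.tracks.append(new_track)

dst.save("take_without_cc1.mid")
```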

Next problem…the other synth that can’t learn a CC.
Since you want the SAME CC Pot working two plugins at the same time, we’ll need to make a new Generic Map. Repeat the same process as above to set it up. For future reference, I personally call this ‘Stacking Maps’.

The reason we can’t just do it twice in the same map, is because if there’s more than one entry bound to the same control in an identical fashion, Cubase will only accept and pump the ‘first’ one down the list and stop. Not sure why, or if there’s a switch in there somewhere to change that behavior. What I do know, from trial and error, is if we make a totally new map for it, it’ll work…both plugins will get automated from the same CC event.

OK, next mess to sort out is…How do I STOP this behavior when I want that control for something else?

You could make a blank preset in the Generic Remote maps that doesn’t do anything. Bind them both to a macro or key command, so it’s easy to toggle that map on/off as needed.

You could just set it up on a different port/MIDI channel, etc., use the presets on your MIDI Controller, and be sure not to use it for anything else.

You could just set the MIDI input for a Generic Remote to none when you’re not using it.

You could use Global or Track MIDI Transformers to make a filter on the other tracks so they’ll always ignore this.

Bump for scads of edits in my last post.

Hi,

Hardcoding is off the table because no matter where I’m working in the project, that parameter changes when I turn Knob 01. That’s Ableton’s way of midi mapping and it’s horrendous.

Quick Controls is still the best of a bad situation. At least when I set it up like in the two pictures, I can set up each synth so the first slot (QC 01) in the Remote Control Editor window uses whichever parameter I want, and then when I select the track, Knob 01 always adjusts Cutoff.

It’s only the “Midi learn”-capable synths that can interpret midi CC from the midi send channels.


I’ve set up two VSTs, one with midi learn and one without. Even though I can use Quick Controls to manipulate the “dumb” synth’s knob with the controller pot that sends CC74, the CC74 that I record in the midi send track only affects the “midi learn” synth, even though both VSTs receive the data (midi LED on the VST).

The Quick Control doesn’t connect CC data to a parameter (like midi learn does); it only assigns hardware CC data to Quick Controls and then Quick Controls to VST parameters. The CC data that comes from the midi send automation does NOT go through this double bridge, unfortunately. Quick Controls doesn’t pick it up at all and the “dumb” synth is too dumb…

To make VSTs “midi learn” directly, the VST needs to be updated, or Cubase needs to implement a midi learn feature like FL Studio’s into the VST “wrapper”.

I think we can close this discussion. The synth developers are hard ass about implementing this and Cubase might still be light years away from something like this.

Brian, thanks for all your insight! I’ve learned quite a lot from this discussion and I think others have as well. Let’s hope the Cubase developers actually read these forums to see the issues we experience, not just for bug fixes :grinning:

I do understand the reluctance here. If you don’t have a lot of hardware controls at hand to ‘dedicate’ to a single purpose full time, it can get in the way…too easy to touch the wrong thing at the wrong time.

There is a kludgy workaround for this too. It can also lead to automating some things in the DAW that don’t have native automation lanes at all (I.E. firing off a macro, moving locators, stopping/starting the transport, arming/disarming monitor/record on tracks, etc.), getting some ‘cycle and version’ action going on for you, and the best part is you can use Logical Editors to ‘batch process’ MIDI tracks, which you can’t do with the VST automation lanes.

The drawback to what I’m about to introduce is that the resolution isn’t very high, so it won’t be ‘tightly precise’ for everything, or uber sensitive when dealing with microscopic intervals of change. Then again, most MIDI faders/pots/pedals aren’t very precise and high resolution either! So does it matter? I think you’d have to be using some of the high end stuff that works with 14-bit instructions (RPN/NRPN combinations), or even sysex-based control, for it to make a difference.
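For a rough sense of the numbers: a normal CC carries a 7-bit value, while an RPN/NRPN pair carries 14 bits. A quick back-of-envelope sketch (plain Python, nothing Cubase-specific):

```python
# Resolution of a plain 7-bit CC versus a 14-bit RPN/NRPN (MSB/LSB) value.
cc_steps   = 2 ** 7    # 128 possible values
nrpn_steps = 2 ** 14   # 16384 possible values

print(f"7-bit step size:  {1 / cc_steps:.3%} of the control range")    # ~0.781%
print(f"14-bit step size: {1 / nrpn_steps:.3%} of the control range")  # ~0.006%
```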

Another thing is that the effects you build using this technique will work in real time, but won’t ‘instant render’ or ‘silent mode quick-mix export’…unless you export the stuff to VST lanes, then copy and paste it where it needs to go. OR, do a live pass and let it record to those lanes.

Here’s how it works.

First, you get a virtual MIDI port. I like loopMIDI myself because it’s free/donationware, it’s easy to add/remove ports anytime you like, you can have as many as you want, and you can name them anything you want, but you could use whatever you want: loopBe, MIDI Yoke, etc. On a Mac…I think you can set them up in Core MIDI without installing anything extra.
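If you want to sanity-check a virtual port from outside the DAW, a few lines of Python will do it. This assumes the mido library with the python-rtmidi backend, and ‘Cubase Router’ is just a made-up port name:

```python
import mido

# Ports created by loopMIDI (Windows) or the IAC driver (Mac) show up here.
print(mido.get_output_names())

# On Mac/Linux the rtmidi backend can create a virtual port on the fly;
# on Windows, create the port in loopMIDI instead and open it by name.
out = mido.open_output("Cubase Router", virtual=True)
out.send(mido.Message("control_change", control=74, value=64))
out.close()
```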

Make a folder in your Project called ‘MIDI Routing’.
You’ll use this folder to store MIDI tracks…most of them won’t have any parts/events on them unless you wish to live record your controller movements on a MIDI part, or draw some automation stuff in with your MIDI editors.

Make an empty MIDI track called something like “Tri Synth Controls”, whatever works for you. Color code it, whatever ya like to make it easy to find.

Bring the MIDI controller device you want to automate directly into this track.
Set the channel to ANY (or to what your bound Generic Remote needs) and the end point (output) of the track to a virtual MIDI port.

Set up a track MIDI Transformer for the Tri Synth Controls track, and filter out all events you don’t want going onto this track.
I.E. if event-type != CC OR value 1 != 74, then delete it (so only CC74 gets through).

Now, when you set up a Generic Remote, instead of having it listen directly to your MIDI Controller device, chain its input to the virtual port.

So now the path is:
MIDI Controller > Tri-Synth Controls Track > Virtual Port > Generic Remote
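For what it’s worth, that chain can be mimicked outside Cubase in a few lines, which might make the routing easier to picture: listen to the controller, pass on only CC74, and let whatever is watching the virtual port do the rest. The mido library and both port names here are placeholders of mine:

```python
# Controller in -> keep only CC74 -> virtual port out (where the Generic Remote listens).
import mido

CONTROLLER_PORT = "MPK2"            # hypothetical controller input name
VIRTUAL_PORT    = "loopMIDI Port"   # hypothetical virtual port name

with mido.open_input(CONTROLLER_PORT) as src, mido.open_output(VIRTUAL_PORT) as dst:
    for msg in src:                 # blocks, yielding messages as they arrive
        # Same rule as the track Transformer above: only CC74 gets through.
        if msg.type == "control_change" and msg.control == 74:
            dst.send(msg)
```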

Now, anytime you wanna work with that hard-bound control, all you gotta do is punch the monitor button on the Tri-Synth Controls Track. Toggle the respective R/W buttons for the automation lanes to record the stuff as needed.

When you’re done with it, toggle it off, and the Generic Remote map is ‘dead’ in terms of responding to your controller movements; however, anything you’ve recorded into this MIDI track will still work.

Yep, you can record those fader movements on this track too if you like, take advantage of the retrospective recorder, MIDI Editors to draw in events, batch process stuff, drag/scale things around, use the ‘nudge tools in MIDI editors’, ‘cycle-record’ modes and stuff that MIDI tracks offer. You could set loop points, start the DAW cycling, and do all the fader movement takes you like…

You can also use the MIDI Processing Inserts on that track. I.E. Use the MIDI LFO Insert.
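If you’re curious what an insert like that boils down to, it’s essentially a slow oscillator sampled into CC values. A rough sketch of the idea (again assuming mido and a placeholder port name, and not how Steinberg actually implements it):

```python
# A crude sine LFO written out as CC74 messages for about ten seconds.
import math
import time
import mido

PORT_NAME = "loopMIDI Port"   # hypothetical
RATE_HZ = 0.5                 # one full sweep every two seconds

with mido.open_output(PORT_NAME) as out:
    start = time.time()
    while time.time() - start < 10:
        phase = 2 * math.pi * RATE_HZ * (time.time() - start)
        value = int(round((math.sin(phase) + 1) / 2 * 127))   # map -1..1 to 0..127
        out.send(mido.Message("control_change", control=74, value=value))
        time.sleep(0.02)      # roughly 50 messages per second
```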

Main thing to remember here is…stuff you record in the Routing track will work fine in real time, but…if you want to instant render anything that uses automation stored on the MIDI Routing tracks you’d build here, you’ll need to convert/copy/paste to real automation lanes first.

Personally, once I’m done with the MIDI versions, I make a habit of freezing those tracks, converting the CCs to automation lanes, and copying them over to the proper VST automation lane. This way, off in the distant future, the project will just work right…even if there are no ‘Generic Remotes’ set up on that Cubase instance at all.