@support please: MIDI routing

Dear Steinberg support,

I’d appreciate it if someone from Steinberg could give detailed information about MIDI routing within a single MIDI track, please.

  1. MIDI Track Rec/Monitor Live MIDI input
    a) → VST Expression Map Remote (filtered out)
    b) → MIDI Input Transformer → 4)

  2. MIDI Event Playback Note/Controller/etc. Data
    a) → 4)

  3. VST Note Expression MIDI CC-Data / VST Expression Map MIDI Data
    a) → 7

  4. Pre/Post Send switch → 7

  5. MIDI Inserts 1-4

  6. MIDI Track Output

  7. MIDI sends 1-4
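For illustration only, the hypothesized order of the stages above can be sketched as a toy pipeline. This is purely my own model of the assumed flow (the event dictionaries and placeholder behaviour are assumptions, not confirmed Cubase internals):

```python
def route(events, sends_pre=False):
    """Toy model of the assumed per-track MIDI flow (stages 1-7 above).

    Each event is a plain dict; the transformers and inserts are placeholders.
    """
    sent = []
    # 1a) Expression Map remote notes are consumed (filtered out), not passed on
    events = [e for e in events if not e.get("em_remote")]
    # 1b) Input Transformer and 2) playback data would merge at stage 4 (identity here)
    # 4) Pre/Post send switch: "pre" taps the signal before the inserts
    if sends_pre:
        sent.extend(events)                     # 7) MIDI Sends 1-4 (pre-insert tap)
    # 5) MIDI Inserts 1-4 (placeholder: drop events an insert would delete)
    events = [e for e in events if not e.get("deleted_by_insert")]
    if not sends_pre:
        sent.extend(events)                     # 7) sends tap the post-insert signal
    return events, sent                         # 6) track output, plus what the sends got


out, sent = route([{"note": 60}, {"note": 24, "em_remote": True}], sends_pre=True)
# the Expression Map remote note is filtered out before any later stage sees it
```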

Maybe there is a graphical scheme or detailed documentation available somewhere? (I could not find anything in the knowledge base or in the manual.)

Thank you very much for your help here!

Hi, :slight_smile:
I’m not sure I understand your question correctly. You would like to have an illustration of how MIDI signals flow in Cubase, right?

Is this a theoretical question, or is there a specific use case related to it?


There are some issues related to the as yet “unknown” MIDI signal flow within a MIDI track:

A) VST Note Expression MIDI CC data
(see 2) in post: VST Note Expression - Cubase - Steinberg Forums)

a) If I have VST Note Expression MIDI CC data for a MIDI note on MIDI channel #ch, I assume the MIDI channel # for the VST Note Expression MIDI CC data is always the exact same MIDI channel #ch as the MIDI note?

b) The MIDI note and the MIDI CC controller lane data are processed by the MIDI inserts before being sent out via the track output/MIDI sends. I can, for example, use a MIDI Transformer to alter the MIDI channel # of the MIDI CC data.

However, the VST Note Expression MIDI CC data is NOT processed by the MIDI inserts, i.e. I can NOT, for example, use a MIDI Transformer to alter the MIDI channel # of the VST Note Expression MIDI CC data.

So I am wondering exactly how the routing of all the different MIDI “sources” within a MIDI track works… (Live MIDI Input, VST Expression Map Remote Control (which filters out MIDI data), Input Transformer, MIDI Event Playback, Note Expression MIDI data, VST Expression Map MIDI data, MIDI inserts, MIDI sends.)
Why, for example, is the VST Note Expression MIDI CC data NOT processed by the MIDI inserts, whereas the MIDI Controller Lane CC data is…

(Use case for altering the MIDI channel # of the VST Note Expression MIDI CC data: I use Note Expression CC data and send it out via a MIDI send to a virtual MIDI port. A Generic Remote template receives at this virtual MIDI port and automates VST parameters. Thus, I use VST Note Expression MIDI CC data to actually automate VST parameters, with all the benefits of having CC data linked to the MIDI note. As the VSTi reacts to MIDI CC too, I set up the MIDI track’s inserts and sends to filter different CCs and transform them to different MIDI channels, so that I can control exactly which CC is sent to the VSTi and which CC is sent to the Generic Remote via the virtual MIDI port. This did not work, as VST Note Expression MIDI CC data is NOT processed by MIDI inserts…)

B) VST Expression Map Remote
(mentioned in the post above. Direct Link: EastWest Sounds)

The MIDI notes which remote-control the VST Expression Map Sound Slots are processed BEFORE the MIDI Input Transformer of the MIDI track.

@JHP: is it still not understandable? Do you have further questions?

I hope you’ve now got sufficient information for providing detailed documentation of the MIDI routing within a track?

Thanks for your help!


Hi, :slight_smile:
OK, I have asked engineering for a diagram, but one does not exist.
So it is not so easy for me to tell you exactly how the MIDI data flows. In fact, I am in the same black-box situation that you are in.

Blackbox testing:

Your explanation is quite abstract. So let’s leave all the theoretical stuff about MIDI signal flow aside and clarify what exactly you are trying to do. What exactly is the use case? What kind of music are you making?


He wants to edit the Note Expression CC data and send it where he wants:
with the CC number, channel (and MIDI interface) you can choose what the data does, the way he set it up in the Generic Remote: MIDI track out > MIDI loopback driver > Generic Remote > all assigned to the Quick Controls of the first 16 MIDI tracks in his .cpr files.
And with those Quick Controls he can control whatever he wants, and he wants to use Note Expression to control the “Quick Controls”.

As regards the Transformer Insert FX (and, presumably, the other MIDI Inserts too), I’ve just done a bit of experimenting, and it seems to me this is, quite simply, a bug (because the same setting in the Logical Editor works just fine)…
Let’s use a simple example… filtering.
Example #1)
There is some CC#1 data, recorded as Note Expression
We add a Transformer Insert FX to the MIDI track, and set it to filter (“delete”) CC#1. It doesn’t filter it.
(so, we might presume that this has something to do with the MIDI signal path… but maybe example #2 will show that that is not the reason :wink: ). On the other hand, if I use the Logical Editor, I can indeed delete the CC#1 data that is recorded as Note Expression.
Example #2) (hopefully this will illustrate my simple “bug” theory :wink: )…
Let’s use Halion Sonic SE as the instrument (because of its “true” Note Expression implementation).
Activate “Tuning” to be controlled by the PitchBend Wheel, and record something. (N.B. what gets recorded is, in this case, not the Pitchbend wheel, but the VST3 “Tuning” parameter).
Use the Transformer Insert FX… it is normal (see the line I just wrote above) that trying to filter “Pitchbend” will have no effect.
Set the Filter Target, instead, to VST3 Event___Tuning (which suggests, because it appears there specifically, that the Transformer FX is intended to act upon VST3 events)… Set the Transformer FX Function to “Delete”. It doesn’t filter out the VST3___Tuning data either! (That, to me, says, quite clearly… “Bug!” :slight_smile: )
Once again, using, instead, the Logical Editor, with the same settings, does delete the VST3 data.

Time to have it properly documented by the white-box folks/engineers then, don’t you think? :mrgreen:

Thanks anyway. There doesn’t need to be a diagram, but at least some detailed description of the intended MIDI flow design?

My own :wink: My primary goal is to badger audio application developers :mrgreen:

No, I once wrote a guide about converting MIDI to VST automation (and vice versa) in real time:

I got a request from a user who asked me to produce a video explaining it visually. I wanted to do just that, so I prepared some stuff to record a video, which was when I encountered the “MIDI routing” issue:

Control VST Parameter Automation Using MIDI CC / MIDI part/event editing capabilities

I’m coming from a sound design perspective: imagine I have a VSTi instrument track #19. I use a filter VSTfx insert in slot #3 of this track. BOTH VST plugins “form” the desired sound. The VSTi is played by MIDI notes. The filter cutoff is “played” by automation only, as the VSTfx doesn’t have a MIDI input, for example.

In this case, with Cubase, I’d usually have a MIDI track for playing back the VSTi, and an automation subtrack, controlling the filter cutoff of the VSTfx.

This, to me, always “feels” weird. Automation subtracks in Cubase are rather meant for mixing/mastering tasks, in a “timeline manner”, not in a musical or “sound designing manner”. In the mentioned case, since the filter cutoff “belongs” to the “sound” of a single note, I don’t want to edit MIDI notes and filter cutoff automation separately on different tracks/editors, in “different contexts”; I rather want to be able to edit the “sound” of a single note (which includes the filter cutoff) in ONE place, “in a musical context”.

The best way to achieve this is the Piano Roll editor. This is where I “draw” notes and where I can draw (Note Expression) MIDI CC data to achieve the “sound” for a single note. As said, the filter VSTfx in this theoretical case is not controllable by MIDI CC, so I need a way to control the filter cutoff by MIDI CC data.

I use a virtual MIDI loopback port and a Generic Remote to “convert” MIDI CC data to VST automation parameters and vice versa: I send MIDI CC data to the virtual MIDI loopback port; a Generic Remote receives this MIDI CC data and “maps/converts” it to remote-control VST automation parameters.

The parameters addressed by the Generic Remote, however, are always “absolute”: I may create a Generic Remote which receives CC#0 values from the virtual MIDI loopback port and remote-controls the “filter cutoff” parameter of VSTi instrument track #19 (the track # where the VSTi is slotted in), insert slot #3 (where the filter VSTfx is slotted in on this track). But when I insert a track before track #19, or move the VSTfx from insert slot #3 to slot #5, this mapping is “broken”, as the Generic Remote always addresses VST automation parameters in an “absolute kind of way”, i.e., after altering the track/insert order, the Generic Remote still controls a parameter # of insert slot #3 of track #19, even though the VSTfx isn’t there anymore…

To overcome this, I created a generic Project Template. THE FIRST 16 TRACKS in every project of mine are 16 unused MIDI dummy tracks #1 - #16. The mentioned Generic Remote receives MIDI CC#0 - CC#127 on Channel #16 from the virtual MIDI Loopback Port. These control the Quick Control Slots 1-8 of track#1 with MIDI CC#0-7, the QC Slots 1-8 of track#2 with MIDI CC#8-15 and so on…

Thus, I have a virtual MIDI loopback port; I can send MIDI CC#0 - CC#127 on channel #16 out of this virtual MIDI port from anywhere within Cubase; the Generic Remote receives from this virtual MIDI loopback port and “remote-controls” the first 128 Quick Control slots in every project. I can then assign these first 128 Quick Control slots to any VST automation parameter in the project. I assign, for example, Quick Control slot #1 of track #1 to the filter cutoff parameter of VSTi instrument track #19’s filter VSTfx in insert slot #3.
When I alter the order of tracks after these first 16 dummy MIDI tracks in the project, or when I alter the order of inserts, the Quick Control assignment “follows”. When I, for example, drag the filter VSTfx from insert slot #3 to insert slot #5, the Quick Control assignment stays intact and still addresses the filter cutoff parameter of this VSTfx, now in insert slot #5, because Quick Control addressing, unlike the absolute Generic Remote addressing, is RELATIVE and follows changes to the track/insert order!
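The CC-to-Quick-Control mapping described above (8 Quick Control slots per track across the 16 dummy tracks) is simple arithmetic. A small sketch, with a function name of my own choosing:

```python
def cc_to_quick_control(cc_number):
    """Map MIDI CC#0-127 (received on channel 16) to a (dummy track #, QC slot #)
    pair: CC#0-7 -> track #1 slots 1-8, CC#8-15 -> track #2 slots 1-8, and so on."""
    if not 0 <= cc_number <= 127:
        raise ValueError("CC number must be in 0..127")
    track = cc_number // 8 + 1   # dummy tracks #1..#16
    slot = cc_number % 8 + 1     # Quick Control slots 1..8
    return track, slot
```

So CC#0 lands on Quick Control slot 1 of dummy track #1, and CC#127 on slot 8 of dummy track #16, matching the template layout described above.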

So, I have a MIDI track routed to the VSTi, I have a MIDI part on this track, I open the Piano Roll for this part and draw a single note. I double-click this note, enter the VST Note Expression editor and draw the filter cutoff curve for this note using MIDI CC#0.

With a MIDI send on this track, I send the MIDI data out to the virtual MIDI loopback port, Channel #16. The Generic Remote receives the VST Note Expression MIDI CC#0 values on Channel #16 and controls Quick Control #1 of track #1, which is assigned to the filter cutoff of the VSTfx…

I edit the “sound’s” parameters for this note in ONE place: the Piano Roll / Note Expression editor.

Without further adjustments, the MIDI CC#0 data from Note Expression is routed to both the VSTi AND to the virtual MIDI loopback port via a MIDI send!

If the VSTi itself reacted to MIDI CC#0 (and, for example, changed presets on a MIDI CC#0 bank change), this would be undesired behaviour, as MIDI CC#0 is meant for the VSTfx, not for the VSTi, although both belong to the same “sound”. So I need a way to filter out MIDI CC#0 for the VSTi, and, preferably, to filter out everything but MIDI CC#0 for the send to the virtual MIDI loopback port.

This is why I

  1. set up a MIDI insert FX Transformer, which deletes MIDI CC#0 events (so the VSTi won’t receive MIDI CC#0 anymore), and

  2. switch the MIDI send to “Pre-Insert” and use a Transformer which deletes everything but MIDI CC#0 and outputs to the virtual MIDI loopback port on MIDI channel #16, so that the Generic Remote effectively receives only the MIDI CC#0 data from this MIDI part.
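The intent of these two Transformers can be sketched as a pair of filters. This is a hypothetical model of the settings above, not actual Cubase code; the event shapes are my own:

```python
def insert_transformer(events):
    """Step 1: delete CC#0 events so the VSTi no longer receives them."""
    return [e for e in events
            if not (e.get("type") == "cc" and e.get("cc") == 0)]

def send_transformer(events):
    """Step 2 (on the pre-insert send): keep only CC#0, rewritten to channel 16
    for the virtual MIDI loopback port / Generic Remote."""
    return [{**e, "channel": 16}
            for e in events
            if e.get("type") == "cc" and e.get("cc") == 0]

stream = [{"type": "note", "pitch": 60},
          {"type": "cc", "cc": 0, "value": 42, "channel": 1}]
to_vsti = insert_transformer(stream)    # the note only; CC#0 is deleted
to_remote = send_transformer(stream)    # CC#0 only, rewritten to channel 16
```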

However, I noticed that the Transformer insert in 1) is NOT processed, i.e. the VSTi still receives MIDI CC#0 data!

Which caused this black-box tester some surprise and initiated this thread :wink:

In summary:

Note Expression is the perfect way to control a single note’s sound parameters in one unique place and in a musical context, no matter how many plugins actually “form” the sound.

I need a way to

  1. edit a “sound’s” parameters (where the sound consists of any number of plugins in its audio chain) in a musical context, i.e. in ONE place; currently this is only possible, and preferable, with the Piano Roll / Note Expression

  2. distribute the Note Expression MIDI CC data for a single note to other targets (MIDI ports/channels) than the MIDI note data itself, as the sound for this note may be formed by any number of plugins

In order to accomplish this, I first need to know exactly how MIDI from multiple sources is meant to flow to/within a track and to the different sinks in a track. I need insights into the black box :wink:

You could just as well, in the short term, develop a way to include any VST automation parameter in the Note Expression editor, if you prefer :mrgreen:

So, in short, the MIDI inserts don’t process Note Expression data of any kind.

It might be a bug, but it might be by design, too; there’s no need for an Arpeggiator on Note Expression data, for example… I don’t know.

And because I don’t know, I want to know how MIDI is intended to flow through a MIDI track…

edit: No, something here IS a bug :mrgreen:

A) Either VST3 events aren’t available in the Transformer when used as a MIDI insert (in the Transformer in a MIDI send they work as intended) - this is the case if Note Expression data is not designed to be processed by MIDI inserts… so the (GUI) bug would be the appearance of VST3 events in the Transformer when used as a MIDI insert.

B) OR the bug would be “Note Expression Data of any kind not being processed by MIDI Inserts”…

I “hope” B) being the case here, and I surely hope for a quick bug fix :mrgreen:

Don’t think of the Note Expression data “in isolation” :wink: … one might very well wish to arpeggiate notes in a Midi Part, that have some note expression data.
I have just tried a slow arpeggio (using Halion Sonic SE) in Arpache 5 (playing half-notes), with some “Tuning” Note Expression during the first quarter-note of one of the notes.
The arpeggio plays o.k. but the Tuning parameter is not heard.
So it does seem, more and more, that Note Expression data is simply “stripped out” of the MIDI Inserts.
[Edit] I should say, “stripped out of certain MIDI Inserts”… it does pass through many of them, unhindered.
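The observed behaviour is consistent with Note Expression events being routed around (certain) inserts and merged back in afterwards. A toy model of that hypothesis, purely speculative, with event shapes of my own:

```python
def apply_insert(events, insert_fx):
    """Hypothesis: Note Expression events bypass the MIDI insert and are merged
    back afterwards, so an insert that deletes CC#1 never sees the Note
    Expression copy of CC#1 (while the Logical Editor, which edits the stored
    part data directly, can delete it)."""
    regular = [e for e in events if not e.get("note_expression")]
    bypassed = [e for e in events if e.get("note_expression")]
    return insert_fx(regular) + bypassed

# A stand-in for the Transformer set to delete CC#1
delete_cc1 = lambda evs: [e for e in evs if e.get("cc") != 1]

events = [{"cc": 1, "value": 10},
          {"cc": 1, "value": 20, "note_expression": True}]
result = apply_insert(events, delete_cc1)
# only the regular CC#1 is deleted; the Note Expression CC#1 survives
```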

Ugh… presumably due to the complexity, which might grow exuberantly: imagine notes with note expression data (VST3 events/MIDI CC) being arpeggiated: with the arpeggio, the VST3 event/MIDI CC data gets duplicated and “overlaps”…

What to do with the overlaps? (Hey, I already feature-requested a Realtime MIDI Consolidation function… It seems it is even necessary…)

I guess, it simply was a design decision for that reason, to strip out Note Expression Data from MIDI Inserts…

Oh, I fear this “bug”, or “design decision” won’t be “fixed” soon… lot of (re)design work here…

hmm… have I found a “workaround” for your excellent descriptive post? :wink:
Write your CC#s as Note Expression in the way you desire, but then, select the desired events, and MIDI Menu>Note Expression>“Dissolve Note Expression”, so you can now have it going through the Generic Remotes as you wish.
If you then need to move the notes at some later stage, you can just (temporarily) do the reverse… select the desired (regular) CC# data and “Convert to Note Expression”, move the note(s) around, then dissolve them again.

Drawing notes/Note Expression is done “by ear”: the part loops and plays back while I draw, and I want to hear the result immediately, so this is not an option. But thanks.

Usually, I prevented MIDI clashes between the VSTi and the VSTfx within the VSTi itself (doing nothing on the MIDI CC#s meant for the VSTfx, if there even are any). But I wanted to do it failure-proof.

Currently I route the track to nothing and use 2 MIDI sends with Transformers to route to the VSTi and the VSTfx, but experimenting with it I think I had latency compensation issues? Not sure though…

So, why do you need to do it via Note Expression, as opposed to “regular” CC# data? (and just convert to Note Expression if you later need to move the notes around?)


Use case 1:
As a user I would like to control any parameter of any plugin the same way it is possible to handle data with the “VST Expression 2” functionality. Of course, for VST2 plugins this would be monophonic. Still, the supreme handling of note-linked automation with “VST Expression 2” makes this desirable.

Use case 2:
As a user I would like to control the QCs of a track with the “Note Expression 2” feature or a functionality that is like “VST Expression 2”. The supreme handling of note-linked automation with “VST Expression 2” makes this desirable.

TabSel, could you write an exact use case for your request/intention in the simple form I exemplify above? Or would you agree that the use cases I have outlined bring your intention to the point? Do you have anything to add there?

By the way… much props for the TabSel method you have discovered in the awesome “MIDI CC zu Automation und Automation zu MIDI CC” thread. :slight_smile:


Use case #1 it is, then :wink:

Any word on the “bug” found here by vic, though?

As JHP said:

The supreme handling of note-linked automation with “VST Expression 2” makes this desirable.


Here is my use case:

As a user I would like to control any parameter of any plugin exactly the same way it is possible to handle MIDI and polyphonic VST 3.5 event data with the “VST Note Expression 2” functionality.
As a user I would further like to be able to process any MIDI data, including note-linked “VST Note Expression 2” data of any kind, with MIDI insert FX.

Of course, for VST2 plugins this would be monophonic, like MIDI is.
Of course, monophonic VST2 plugin parameter data supported within the “VST2 Note Expression 2” functionality could collide with existing automation subtracks addressing that same VST2 plugin parameter, just as monophonic Note Expression MIDI data may collide with Controller Lane data.

As a user I would like to have Cubase take care of collision handling, by running the current offline “consolidation” functions in real time on monophonic note expression data (such as MIDI and VST2 parameters) BEFORE or WITHIN a MIDI insert FX.

The supreme handling of note-linked automation with “VST Expression 2” makes this desirable.

Too many words?

No, that’s OK. This is a use case I can pass on to the respective departments. :wink:

Regarding the “bug” found here by vic:
I would not call this a bug. VST Expression 2 is not really MIDI, but the MIDI Insert is, and has always been, a MIDI processor.

You can use the “Merge MIDI in Loop” command to write the arp into the MIDI data stream. Now select one note, double-click it to open the VST Expression 2 window, select all of the data (Ctrl+A) and copy (Ctrl+C) the VST Expression 2 data. Now hit the right-cursor key on your QWERTY keyboard and hit Ctrl+V. Do that in a fast repetitive way to quickly jump from one note to the next and paste the VST Expression 2 data into each.


Hi Jan,
If that is so, then why does the Transformer Insert FX offer “VST3 Event” as one of the options in the Filter Target? (but, like I said, it doesn’t actually work) :slight_smile: