Add Midi Remote style features to External Instruments (to create a Device Panel)

EDIT - I’ve left my original post below, but have provided a summary of the feature request as it’s grown through discussion in this thread.

This feature request proposes replacing the current “Device Panel” system for controlling external MIDI devices with a new, simplified system similar to (and possibly incorporated into) the “MIDI Remote” system introduced in Cubase 12. The intent of the feature is:

  • To allow direct control of your external MIDI hardware from within Cubase via a user-created panel: clicking and moving a knob or slider on the panel controls/programs the external device, and panel objects (like knobs/sliders) can automate external MIDI hardware within Cubase’s track automation system. The panel would also, of course, be controlled from the hardware, so physical changes to the hardware’s knobs/sliders/switches would update the panel GUI.
  • These panels can be directly assigned to an “External Instrument” or “External FX”, so that hardware can be integrated into Cubase as though it were a VSTi or VST.
  • The user-created panel would adopt the same creation process as MIDI Remote: “move a knob on your synthesiser; it appears in the device panel during setup”… This allows users to very quickly develop panels to program and automate their hardware. For external MIDI devices that do not have knobs and sliders, the system would require manual input of an object into the panel and manual definition of messages - exactly the same as the current MIDI Remote setup, where objects can be manually defined if needed.

I propose this would be:

  • Likely best incorporated into the current MIDI Remote system as a DIFFERENT type of external MIDI device to a controller. This feature does not need to (and shouldn’t) add unnecessary features to setting up a controller (which is the current intent and sole function of MIDI Remote). E.g. upon setup, users would select “Controller” OR “Device Panel”, and these would be separated by tabs, menus, symbols, or filtering options (not all at once… these are just options to consider). This keeps things neat and tidy for users.
  • A system that is able to name objects (similar to the current naming fields, BUT user-defined rather than auto-generated, so you can name something e.g. “Cutoff” or “Filter Attack”), which means that object naming is logical for programming and automation. These names would need to be registered within Cubase’s parameter naming system (similar to VSTs/VSTis) so that they can be automated (i.e. “Cutoff” becomes an automation lane) AND separately targeted by a MIDI Remote panel if desired (this allows synths and devices with poor physical controls to be remote-controlled by another MIDI Remote device for improved hands-on performance and programming).
  • A system that includes patch storage. At the basic level, this would simply store the positions of the Device Panel within the current Cubase preset storage system, as per any VST/VSTi, where loading a project or a new preset simply sends a series of parameter changes to the external device. At a deeper level, I think a new object type within the MIDI Remote-style protocol could be a “sysex dump” or “program change” object, allowing users to dump and send MIDI program change data and also select a physical location on their storage devices (i.e. hard drive) to save and load data from.
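As a sketch of the “loading a preset sends a series of parameter changes” idea: a stored patch could just be a map of named panel objects to CC assignments and positions, flattened to Control Change messages on load. This is purely hypothetical - the structure and function names are invented for illustration and are not any existing Cubase API.

```javascript
// Hypothetical: a patch stored as { name: { cc, value } }, flattened into
// the series of Control Change messages sent to the device on preset load.
function presetToMidi(preset, channel) {
    var status = 0xB0 + (channel & 0x0F); // Control Change status byte
    var messages = [];
    for (var name in preset) {
        var p = preset[name];
        messages.push([status, p.cc & 0x7F, p.value & 0x7F]);
    }
    return messages;
}

// Example: restoring two named panel objects on MIDI channel 1 (index 0)
var messages = presetToMidi({
    "Cutoff":        { cc: 74, value: 100 },
    "Filter Attack": { cc: 73, value: 12 }
}, 0);
// messages → [[0xB0, 74, 100], [0xB0, 73, 12]]
```

The point is that the “basic level” of patch storage needs nothing device-specific at all - it is just a replay of the panel’s current CC state.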

Original post below…

First up - MASSIVE thanks to the Cubase dev team for allowing users to select multiple audio inputs simultaneously across audio channels, “External Instruments” and “External FX”. It’s a total game changer for users like me who have a lot of outboard gear. I’m incredibly pleased about this…

Not to sound ungrateful, but I just want to strike while the iron’s hot, as someone in dev has obviously looked into new opportunities within “External Instruments” recently, hence the changes to in/out audio and MIDI routing options!..

SO:

Another INCREDIBLE feature for hardware users (e.g. synthesisers) would be to allow the current MIDI Remote-style features to create a device panel for an external synth (e.g. linked to an “External Instrument”). Imagine if all you needed to do was select “Create New External Instrument Panel”, then use the MIDI Remote-style setup to simply twiddle each knob or command on your synth for virtual knobs to appear in a panel… arrange them as you like; perhaps even add a custom background (e.g. an image of your synth, or whatever you like). Click “OK”, and then BOOM - you have a panel that opens up like a VSTi with direct control of your synth, with two-way communication between hardware and panel interface, AND each named panel object gets an automation lane ready for recording automation of synth parameters. You could even allow patch saving within the current Cubase patch library (which upon patch change just spits out control changes).

Current MIDI Remote is SO close to this, I wonder if most of the work is already there?

Please?

Pretty Please!!..


Just a little tiny bump…

If anyone with a hardware MIDI unit (synth; drum machine; FX unit) could please have a read and post a comment or discuss, that would be great. If you agree that simple control and an easy patch librarian for hardware would be amazing, then please pop a vote in here!! We’re 90% of the way there with MIDI Remote… let’s just expand it to control hardware too!!! :slight_smile:

I imagine this post will quickly become buried under all the other posts (which are all important too, of course!)… I’d just hate for this idea to be missed by people who’d actually love it as a feature!.. and voting does seem to count for the Cubase dev team’s focus…

I think this feature would instantly make Cubase THE DAW for hardware MIDI users. For Steinberg, this means an awesome feature that attracts a customer base (“own a MIDI synth? You probably should use Cubase”). For users, it means a simple and friendly workflow and community vibe, with control panels shared for any MIDI hardware we have. For hardware companies, it removes the burden of developing and maintaining custom software interfaces.

I think what you’re asking for is an upgrade to Device Panels. None of what you’re talking about really has anything to do with MIDI Remote.

Thanks for the comment :slight_smile:

It’s actually a yes and no answer though…

The “no” answer is: primarily, what I’d like is for MIDI Remote to be bidirectional. That is, move a fader in the remote GUI and it sends data back to the hardware. Currently it is uni-directional as a GUI (the GUI only receives data from either the controller or the mapped destination). It works well as designed (to control Cubase from an external controller), but it has untapped potential for bi-directional controller setups that both receive from AND control external MIDI equipment.

The simplest example: I can SO quickly set up my Bass Station 2, mapping every knob and slider into a nice GUI using MIDI Remote… It would be incredible if I could then use that MIDI Remote GUI to program my Bass Station 2… all the mapping is there; it’s just not a function available in the current MR setup.

The “yes” answer is: well, yes!.. Being able to assign a MIDI Remote as a device panel (as opposed to “just a controller”) extends this functionality the next step. Now this MIDI Remote GUI is directly assigned to an External Instrument VSTi, and along with its ability to remote-control and program your synth, we have PATCH STORAGE!.. either a simple “dump settings” to the synth, or integrated with patch change commands if the user prefers. Either way - holy crap… Cubase has just become a full and simplified MIDI librarian for all of your MIDI hardware. That is some jaw-dropping functionality in my opinion, and it fills a huge gap in the current market for librarian software.

Almost all of the above is part of the current functionality of Cubase. I know it’s foolish to suggest “they have it all there waiting to go”, as coding and dev take time and adding functions isn’t as simple as hitting Enter on the keyboard; but honestly… they ARE remarkably close to this functionality already…

Interested in your thoughts…

I understand what you’re saying. I have some thoughts on this though.
The MIDI Remote surface is not a GUI. It does not exist for the user to interact with directly. The whole purpose of MIDI Remote is to create a bridge between MIDI messages and VST parameters. In versions <12 this was accomplished via the Generic Remote. The two basically do the same thing. The beauty of the MR panel and the Mapping Assistant is making the mapping of MIDI to VST parameters more user-friendly.
The way I see it, there is no need to turn a MR panel into a GUI because VST parameters already have GUIs (the plugin GUIs, the MixConsole, etc). What you are talking about, a panel where the user can assign graphical controllers to MIDI messages for easy control of external hardware, is in my view outside the scope of MIDI Remote.

However, a user-definable GUI for controlling MIDI devices already exists! The good old MIDI Devices. It fits your description. You can assign faders, buttons and knobs to MIDI messages and create a GUI you can interact with. You can even create patch lists that enable you to select a patch name of your MIDI device in the Inspector.
Sounds fantastic, right? Well, it would be if MIDI Devices received a massive upgrade. It is tragically archaic in its current state. It does what it says on the tin, but the tools are a bit of a nightmare to work with. This is where I feel Steinberg can take advantage of their success with MIDI Remote. They could use the same building blocks used to make MIDI Remote and apply them to MIDI Devices, but I do not think the two should be combined.


Ah yes, cool. It seems we’re talking about the same thing indeed, in a slightly different context, but yeah - effectively the same.

I guess in my head, having access to MIDI Remote as such an impressive setup for my motorised controller setups (not sure if they’ve fixed the feedback bugs, but I’ve used BOME translator to work around that anyway!)… I’d envisaged this sitting nicely in that system too. That said, I couldn’t care less if it were separated out into a “device panels” setup with the same fundamental process of “move thing; it appears; accept; move next thing; etc.”.

I also agree that there’s zero need for a controller to have GUI control - I’d envisaged this as an alternative setup within MIDI Remote (i.e. select “Controller” (no GUI control) or “External MIDI Device” (GUI control)). I can’t imagine it would be too much of a stretch to extend the GUI control we have everywhere else in Cubase (click, hover, etc.) to the MIDI Remote GUI. But again, whether it’s contained in MIDI Remote or is simply an updated “Device Panel” system based directly on the MIDI Remote interface doesn’t particularly bother me :slight_smile:

The only slight advantage in my mind would be those cross-over situations… e.g. having it contained within MIDI Remote would be VERY cool for control of mixing consoles that share internal systems (within the console) whilst ALSO being usable as a controller. A quick example is an Allen & Heath QU16 (though there are better examples, I’m sure), which I have set up as my controller for Cubase… it would be next level if my Cubase MIDI Remote could also set up additional pages to control the QU16’s own internal functions (which are all MIDI-programmable) like mic levels, sends, compression, etc. I know they’re all available on the board, but the extreme depth of MIDI Remote could easily provide a far friendlier and more customised user interface if it had bi-directional communication.

But I digress… I REALLY want this feature for my synthesisers, drum modules and external MIDI FX :smiley:

Device Panels… it seems you know all too well lol. It’s painful (and effectively unfinished). I see what they were hoping for, though. Its biggest drawback by far isn’t its mind-numbing setup though… it’s that, once again, it’s not bi-directional. This time, however, it’s the other way around: you CAN program your synth, but the panel CAN’T receive MIDI externally. So the panel can never update from your synth (it won’t reflect changes made on the actual physical synth). You can’t automate your panel either :frowning:

It’s genuinely funny (as in, it makes me do that “isn’t it ironic” chuckle) - MIDI Remote can’t quite get there because it can’t send MIDI data externally; Device Panels can’t because they can’t receive MIDI data externally… together they would work, but MIDI Remote can’t target Device Panel objects, and Device Panel objects can’t even be targeted by Quick Controls as a workaround… lol… like I said, makes me laugh.

Thanks again for your input. Look - I don’t want to push or shove the idea… have a think, though, or keep chatting… if you see genuinely useful potential for you or others, please do give it a vote :slight_smile:


The Device Panels are buggy as hell.
The editor is totally inflexible.
The procedure despite the manual is a nightmare.
One wrong step during input and you start all over again.

An editor like MIDI Remote’s, but where you could assign MIDI CCs, would be a dream.

Hardware synths are popular again these days, and I don’t understand why Steinberg doesn’t take Device Panels more seriously.


Mind clarifying this? As far as I know, MR can send MIDI to outputs.

I know, right!.. A total dream come true, hey… Thanks for your vote too! I really hope this can get some attention on here, and also hope we can get more ideas from others.

I might edit up the opening post at some point to refine the concept as it goes too.

You’re spot on about hardware synths too - while soft VSTis and FX are an obviously strong and professional choice, the market for hardware actually seems to be growing too… and yet there’s really not a single DAW that integrates hardware MIDI elegantly (or even at all, in a true sense of integration). I think this would actually be a real selling point for Steinberg too - I know it would be for me.

If you have any further thoughts on how it would work for you, or any other “requirements” you’d have, pop them in this thread and we can discuss them and add them to the “feature list”! :slight_smile:


I suspect because a hardware vendor can actually implement a VST supporting the synth. For example, here I have 2-3 synths which do have VSTs for editing their params. Just a thought, nothing to do with the FR…

Hi m.c - thanks for chiming in :slight_smile: I’ve followed and think I’ve also joined in on various discussions with you on MR on the forums. Nice to see you here :slight_smile:

What I was referring to here is that the actual MR interface you develop can’t be user-controlled - as in, you can’t move a fader in the actual MR page and have that send MIDI data back to e.g. a synthesiser. The benefit would be using this interface to e.g. program a synthesiser.

We DO have the “transmit to hardware” option in MR, which works well for controller setups (so they can update based on changes in Cubase)… So this option effectively allows the hardware to send MIDI to Cubase, and the targeted item in Cubase to send that same message back to the hardware - just not via changes to the actual MR GUI. You can also target, as far as I’ve found, an automation controller lane for a standard registered CC… so this sends externally in theory, and in theory can also send a different CC to what’s received… it’s how I’ve currently set up a test system for synth automation, but the setup is a bit clunky, I haven’t finished testing, and it may not work as intended.

This “can it send externally” question is actually a good point… it has me realising something (for the feature request). While at a simple level all you’d need is 1:1 mapping of MIDI data (e.g. this item receives CC 50? Well, it also sends CC 50), at a deeper level you’d probably want this feature to receive something and send ANYTHING as a result… AND you’d actually want this External Instrument Device Panel version of MR to be controllable from another MR script. Why? Because if you have a synth like a Matrix 1000 with no onboard editing capabilities (it must be programmed remotely), this would allow you to control and edit it from within the Cubase GUI and then also control it with another controller you have set up… another “very cool” part of how this could work!!
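The “receive something, send ANYTHING” idea could be sketched as a simple translation table that rewrites an incoming message before forwarding it to the hardware. This is hypothetical illustration only - the function and table are invented, not part of any current MIDI Remote API:

```javascript
// Hypothetical translation table keyed by incoming (status byte, data1):
// a matched message is rewritten before being forwarded; anything unmapped
// passes through 1:1.
function makeTranslator(table) {
    return function (msg) {
        var rule = table[msg[0] + ":" + msg[1]];
        if (!rule) return msg;                    // unmapped: 1:1 pass-through
        return [rule.status, rule.data1, msg[2]]; // keep the moving value
    };
}

// Example: the panel object receives CC 50, but the synth expects CC 19
var translate = makeTranslator({ "176:50": { status: 176, data1: 19 } });
var out = translate([176, 50, 99]); // → [176, 19, 99]
```

The 90%-of-the-time “mirror received message” case is just the empty table.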

To get my head around things, perhaps you can help me understand something, as I’m a little rusty on MR (I set everything up in 12 when it came out). In the current MR, are you able to edit the destination to send to “any” external channel with any message? I feel like I’m missing a deeper menu somewhere in the MR Mapping Assistant… e.g. can I set up an MR item (like a knob or fader) to send:

  • A CC of a user-nominated number to a specific MIDI port and channel?
  • An NRPN in various nominated bit formats?
  • Sysex (e.g. a custom sysex command, typed out in a data entry box… as some of these can be quite long / custom / complex… but of course necessary!)?
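To ground the question, here is roughly what those three message types amount to as raw bytes under the MIDI 1.0 spec. The helper names are made up for illustration; nothing here is a Cubase or MR API:

```javascript
// Control Change: one 3-byte message (0xB0 + channel, controller, value).
function cc(channel, number, value) {
    return [0xB0 + (channel & 0x0F), number & 0x7F, value & 0x7F];
}

// NRPN (per MIDI 1.0): parameter number split across CC 99 (MSB) / 98 (LSB),
// then the 14-bit value across CC 6 (MSB) / 38 (LSB).
function nrpn(channel, param, value) {
    var status = 0xB0 + (channel & 0x0F);
    return [
        [status, 99, (param >> 7) & 0x7F],
        [status, 98, param & 0x7F],
        [status, 6,  (value >> 7) & 0x7F],
        [status, 38, value & 0x7F]
    ];
}

// SysEx: an arbitrary 7-bit payload framed by 0xF0 … 0xF7.
function sysex(payload) {
    return [0xF0].concat(payload).concat([0xF7]);
}

var cutoff = cc(0, 74, 100);            // → [0xB0, 74, 100]
var fine   = nrpn(0, 300, 8192);        // 14-bit parameter 300, centre value
var dump   = sysex([0x10, 0x06, 0x01]); // → [0xF0, 0x10, 0x06, 0x01, 0xF7]
```

So a “data entry box” for sysex really just needs to collect the payload bytes; the framing is fixed.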

Using scripting it’s absolutely possible. However, I’m not sure I understand, because it seems to me that we have two different concepts here.
a) The concept of the MR Window to act as a real UI, i.e. sending info when we move something in it, not possible right now as you correctly state, and
b) Translating messages to send to a hardware synth. This is doable via scripting.


Yes, but not all synths.
In addition, you are dependent on the maintenance of the manufacturer’s plug-ins.
VST2/VST3 is another thing.

A Device Panel like MIDI Remote, with buttons, faders etc., where you can assign a MIDI CC and label it (“Reso”, “Attack”…), would be possible.
A list of all the MIDI CCs, sendable and learnable from the hardware controller, similar to MIDI Remote - except that it would actually send the MIDI CCs.


I know my friend, I know. I’ve just mentioned it as an alternative view on the whys behind the device panel not moving… But who knows, everything needs time, maybe it will be altered in the future :slight_smile:


I’m not a programmer. :wink:


I would really like to see an overhaul of the Device Panel editor used within MIDI Devices. The current one is clunky to use. I envision the creation of a device panel could be made much more user-friendly.

This is where things won’t go your way. Not every external synthesizer sends MIDI messages for every parameter it has. Especially sound modules - you mention the Oberheim Matrix 1000. Most of them are designed to receive MIDI rather than send it themselves.

The MIDI standard has no concept of two-way communication. If it had, MIDI sequencers like Cubase, Logic, Performer, Opcode Vision, Cakewalk, and so on would have supported it in the 1990s already.

As far as I understand, the new MIDI 2.0 standard will allow bi-directional communication, but that means, of course, that it will only work with newer hardware synths that support the new standard.

It has a lot to do with the FR. The vendors are using a VST plugin in order to allow bi-directional communication between the DAW and their product, because MIDI 1.0 has no such concept.
While MIDI is great as a communication protocol because it is widely supported, it is basically crap at a lot of things. VST has overcome a lot of MIDI’s shortcomings. MIDI 2.0 will, too, but I fear it is about 30 years late.


Please don’t destroy my memories from youth :smiley:

Yeah, I guess scheduling vs executing is time consuming :smiley:


Yes! Sorry, had my daughter running around tapping the keyboard, so wrapped the post up quickly lol…

They are two different topics, but broadly part of the same concept… I just want to get my head around what’s currently implemented and what might be missing from the current system, so I can update the feature request to be thorough about what’s required.

It’s great that you can script any command… really, really great in fact. I think ideally it would be super user-friendly, but that’s a good start (that it’s already possible). I think the “super user-friendly” version would be the MR MIDI input box (which summarises the incoming message) simply being duplicated to the right as the “send” message, with a checkbox for “mirror received message”… 90% of the time you’d have that checked, as you probably just want to send the same message received, but in some scenarios it’d be great to customise an outgoing message, and having simple user input would make it more accessible to broader, less nerdy users. E.g., as noted above, some people won’t easily jump into scripting something up.

I just looked up the manual of the Oberheim Matrix 1000. You know what the “MIDI transmit” section looks like? It’s a dump (p. 41). No individual settings are transmitted; only a bulk dump is possible. I.e. every time you change a value in Cubase to be sent to the Matrix 1000, it would send a bulk dump back, which might block the device and the MIDI channel for several seconds.
And then Cubase would need to know how to read this bulk dump, because there is no standard for it. That means every single external synth would need individual support within Cubase.
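For what it’s worth, the one device-independent part of reading any bulk dump is the framing: the MIDI 1.0 spec defines SysEx as 0xF0 … 0xF7, so splitting a raw byte stream into complete messages is generic. Everything *inside* each frame is vendor-specific, which is exactly the problem described above. A sketch (function name invented):

```javascript
// Generic first step of any bulk-dump reader, per MIDI 1.0: split a raw
// byte stream into complete SysEx frames delimited by 0xF0 … 0xF7.
// Interpreting the bytes inside each frame is device-specific.
function splitSysex(bytes) {
    var frames = [], current = null;
    for (var i = 0; i < bytes.length; i++) {
        var b = bytes[i];
        if (b === 0xF0) current = [b];      // start of exclusive
        else if (b === 0xF7 && current) {   // end of exclusive
            current.push(b);
            frames.push(current);
            current = null;
        } else if (current) current.push(b);
    }
    return frames; // an unterminated trailing frame is dropped
}

var frames = splitSysex([0xF0, 0x10, 0x01, 0xF7, 0xF0, 0x10, 0x02, 0xF7]);
// frames.length → 2
```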

You have a nice idea. It just isn’t doable in reality. That’s why no DAW has done it so far.