MIDI Remote - Defects & Lesson Learned

I will not say anything here anymore and will just shut up if you can confirm that this is really an API bug. It could also be a bug in how we use this function. And to be honest, I am not 100% sure about this. Are you?

Nothing against you personally.

The point being that if the MIDI Remote were based on MCU, you’d expect it to be able to actually emulate or run as an MCU. But it can’t.

i.e. it’s not able to page through parameters, select specific insert slots, or do any number of things that the MCU protocol allows for. There’s not even a standard jog control; you have to create it yourself via triggers that define forward and reverse actions based on the direction turned.
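To illustrate the kind of trigger logic you end up writing yourself, here is a minimal sketch of decoding a relative jog value in plain JavaScript. The sign/magnitude encoding (1–63 = clockwise, 65–127 = counter-clockwise) is an assumption based on how an MCU-style jog wheel typically behaves; other devices use different relative modes, and the function name is hypothetical.

```javascript
// Sketch only: decode an MCU-style "sign/magnitude" relative CC value into a
// signed step count. Assumes 1..63 = clockwise steps and 65..127 =
// counter-clockwise steps; check your device's manual, since other relative
// encodings (two's complement, binary offset) exist too.
function decodeRelative(ccValue) {
  if (ccValue < 64) {
    return ccValue;          // clockwise: positive step count
  }
  return -(ccValue - 64);    // counter-clockwise: negative step count
}
```

In a MIDI Remote script you would then route a positive result to your “forward” trigger and a negative one to your “reverse” trigger.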

I appreciate that you don’t accept what’s being told to you, but I can only answer truthfully to what you’re asking.

The other thing that SB don’t want to openly accept (due to it being a paid feature) is that the MIDI Remote is almost in a public beta stage right now. So there will be oddities that don’t make sense, fixes required, functions missing, documentation errors, example errors - which is clearly frustrating you, as it is a lot of us.

But that’s a whole other topic to whether it’s based on MCU or not.


Hi skijumptoes :slight_smile:

This is a relief to me. I guess you are a native English speaker. If so, please avoid such phrases and metaphors. This is a multi-language community and I am not familiar with them. Since there was no answer from you yesterday, it was the icing on the cake and really tilted me hard.
So again, in your own interest, avoid things like these.

That was a lesson I learned yesterday. I made a mistake in mentioning MCU and revised my statement. If you just replace MCU with “the vendor/company that was first to have a display, and this display is supported by Cubase”, my previous posts would make more sense.
I don’t care which company it is. My main point was to explain that there are many things where Steinberg doesn’t need to reinvent the wheel, as it was all done decades ago. It would be dumb not to use these resources and to build the API completely from scratch.
That is why I wrote “based on” and not “tied to” MCU. Sadly, every other person yesterday jumped on the MCU wagon and not on the thing I tried to explain.

That’s why I wrote that the API is capable of doing the same things as MCU, but not all of them. Some stuff is missing, but not too much.

You all said yesterday that the MCU (omg, again) uses standard MIDI communication. For this communication, the “hooks” into Cubase were made a long time ago. Why would you create new “hooks”? I would take these “hooks” from the past and put them one by one into the new API, because the API is not different from standard MIDI communication. The API index is nothing more than these “hooks”. That is why I see so many similar things to MCU (or other vendors).

It is just sad that the API index is documented like $hit in the first place, which only leads to heated discussions where people split hairs over whether the API has similarities to MCU (or another vendor).
Because of the poor documentation, people start to brew their own soup with their own scripts, resulting in code that reads differently EVERY time and for EVERY script.

It is very easy to imagine how that would look if the documentation had been readable and nicely explained, instead of putting out something that looks like the current state of the API.
ALL scripts would look somehow similar (you don’t reinvent the wheel if you are smart).
Sadly, this opportunity is gone and leaves “normal” people way behind, as they can’t do anything with the API on their own, except poke at it a little and spend precious time on trivial things.

Sadly, this pretty dumb approach splits the community into people who are frustrated (like me) and hyper-active fanboys who jump in and tell us (frustrated) people how wonderfully the API works on their cheap-$hit controllers with absolute encoders (probably not even having a display). This is the point where I say to myself: “Hey fanboy, you could have done this decades ago with the Generic Remote for your cheap-$hit controller. You simply don’t need a new API to do that. So piss off.”

Sadly, the hype around the API is worth more than the crying people who can’t get a simple LED to work, even after months. Dude, an LED… months… I’m puking. How can I tolerate that hype? How is it possible that a hyped API does not offer fundamental things like these? No, you need to write it yourself. SAD.

As long as you can create a jog control with JavaScript API code, you cannot say it is not possible; it just needs work to do so. SAD.

It’s an English language community and you should make an effort.


This is a forum in English. If you’re not familiar with certain words or phrases, look it up. You can’t ask other users to adapt their language based on your level of understanding. That is ludicrous.
Also, Mr. Skijumptoes did not say anything offensive.


English is hard. I’ve made an effort to learn 7 languages in my life, and even though English is my first language, it is by far the most difficult.

@u-man, if you need help with code, then post it. Fork the GitHub branch and push your code and post the link. Many on this forum will be willing to help.

I needed help at first and got a lot of help from many kind people here on the forum, including Jochen, Martin, Skijumptoes, Nico5, MarcoE et al.

Post your code, and let us figure it out together.

That’s kind of my main issue with where the MIDI Remote is right now. It’s fantastic at more immediate ‘friendly’ mappings. But as soon as you want to delve a little deeper, you have to jump into the API.

Now, I always imagined the API to be a vendor (manufacturer) facing environment, so I expected there to at least be a full array of existing functions from the MCU and/or the Generic Remote off the bat - because they are what most generic devices are currently using.

Placing a message into the Generic Remote sections that it’s going to be discontinued, and recommending the MIDI Remote as an alternative - OK. But when the MR is not able to perform those previous tasks, SB are then pushing people down a vendor API route, so no wonder there’s confusion.

It feels to me that this needs another year of beta testing; the documentation and implementation are messy, and most of it simply bolts on top of what is already there - hence the restrictions already in place. Perhaps this ‘bolting on’ to what’s there already is what you term ‘based on MCU’? If so, I understand the point you’re making.

We’re still left with this restrictive 8-parameter VST Quick Controls, Track Quick Controls and now Focused Quick Controls system - the repeated use of ‘Quick Controls’ across multiple elements is just weird, and the refusal to allow us any kind of paging system within them is just maddening.

It should’ve been written from the ground up; then it could’ve included the ability to manipulate events, MIDI note data and colour coding from instruments such as Groove Agent, so that devs could utilise that for pad and key RGB. And also mapping to plugins at a class level, not per slot/track.

In fact, that’s what I was expecting it to be. Not tied to current control layers, but a direct route to DAW events and plugins.

Also, as many have called out, the API is based around the ES5 revision of JavaScript, which is quite antiquated now, and I think that only adds to the frustration when you consider that it already feels old and backwards to code within.

I was very happy with the immediacy of creating a script, and with the mapping assistant within Cubase itself, but it feels that getting a visual representation of the controller in Cubase was a higher priority than actually considering finer-detailed elements such as multi-page mappings, focusing/opening insert slots, mapping single-CC jog wheels, encoder scaling etc. - most of which we were doing beforehand with Generic Remote/MCU devices.

Maybe there’s more expansion to come in the future with MIDI 2.0 + VST3 only, but in terms of genuine improvement, I feel that they’ve gone down the wrong road here - through fear of disrupting what was there already, I expect.

But sometimes you just gotta be brave, and believe in the project and make it better than what you’re ripping out. As I see it, there’s just more being piled on top. (IMO of course!)


I agree. I will say this, though: as a seasoned computer scientist who has often been afforded the ability to choose my projects, it reminds me a lot of the sort of project that I would jump into with the same intention, only to learn that the reality of the legacy code presented unforeseen challenges.

That is why I now fully embrace the idea of Evolutionary Design. What we have now is truly amazing. I suspect that a lot of hard decisions were made in order to include MR in the 12 release. There is plenty of room for improvement, but other than a few minor inconveniences, I was able to convert my GR mappings (which took days to create) into UI-created MR mappings in minutes.

If the MR environment continues to evolve - provides an ES6 API, adds the features from GR that are missing, provides two-way event handling that does not create lag in the DAW, allows MIDI to be sent back to the device as chosen by the script writer, provides a pass-through mechanism to allow direct CCs (or even notes) on one page but mapped controls on another, and allows paging of Focused Quick Controls, among other things - I am sure we will still find more to be desired.

If the MR features fall by the wayside and go a decade with no changes, and are then replaced by something new, that would be very disappointing. But that doesn’t seem to be what is happening. I only wish I could dig deeper into the code to see what is happening underneath and help the thing evolve. That’s a good thing, because it means that I am inspired.

Now if I could only get inspired to finish some of these tracks…


You got it! Yes, if that’s the roadmap, brilliant. Plus whatever MIDI 2.0 brings, of course. :slight_smile:

I think what we have today is pretty much it, though. I can’t see how they’re going to bring specific parameter elements into the API for us to bind to. Or the selected-track elements from GR, such as channel strip elements, even when they change slot/position.

There’s also no appreciation of why users would want to page through quick controls, which is concerning.

Unless the plan is to create the hardware layer via the API, and then add the rest via the mapping assistant, exporting that out as the final script?

If that is the expected method, then it’s me that’s completely missed the boat on this one! As I’ve been trying to map as much as possible within the API, I need to turn my thinking upside down and only use it for defining the hardware.

But then, there are elements you have to bind in the API to be able to translate the expected values correctly. And once you modify that hardware control, it becomes non-generic in the mapping assistant. :confused:


I did. It did not help much. I still had, and have, issues understanding the whole sentence, and the only feeling I had was that it did not sound nice to me. That’s why I wrote that I don’t understand it.
If there is then no answer (from him), and I already had a bad feeling about it, how would you feel?

Is it really important for you to read phrases (like these) that raise more questions than answers in something already as complex as discussing API functions?
For me it is not important. It does not contribute anything; quite the opposite… it is irrelevant.
If you really want that kind of stuff… no problem, take the L.

What is the problem with you all? I am familiar enough to talk about API stuff, and I gladly look up things that I don’t understand, but phrases like these have no point if people (like me) do not understand them as a whole or can’t find a clear definition for them. Also see above: take the L too.

Neither of you could explain it either, but you have the balls and nerves to write such a comment? Seriously?

Dude. I don’t even know what phrase you are referring to. Is it “the elephant in the room”?

My point is, you can’t tell people to adapt their language because you can’t understand certain things.
If there was something in a post you didn’t comprehend, why not just ask what they meant?



I’m hoping this is wrong, but that is what I am trying to do with VSTis and the most recent changes. You have to know what you are going to map to, but as you state, anything interesting requires the mapping to be known in the code.

I still don’t see the mOnValueChange issue fixed either. It isn’t a solution because it causes extreme lag in the DAW. You can’t use it to move faders or change LEDs.

It may work for @Ron.Garrison though, but he would need to add page specificity to the event handling. (Ron take a look at how I did that here: https://github.com/oqion/midiremote-userscripts/blob/main/Novation/Launch_Control_XL/Novation_Launch_Control_XL.ts)

The way I see it, the first thing to solve is that everything is MVC-like, looking in one direction only, with the “Surface” as the View. But from the script writer’s perspective, the View is the device. This happens a lot with real-time telemetry-based systems like flight control or mission control. And it isn’t even just UI-related; it also affects what is “on-thread” and what is “scheduled” differently as well.

But that stuff is probably easy enough to resolve; it’s just a lot more work to add a thread pool or whatever, going in what would seem like the wrong direction. The real hard stuff, I suspect, is buried in where to put the hooks into the legacy code.

Of course, without looking at the real code, I’m completely clueless.


Oh man, I didn’t mean you were the Elephant @u-man.

The Elephant is that MR isn’t able to match the abilities of the MCU, which is the bigger issue being missed in that conversation about whether it was ‘based’ on MCU.

If that still doesn’t make sense I’ll go give myself a good slap, then you can feel good again! Your English is great, by the way. :+1:


No one is trying to not be nice. You are a welcome and valued member of this forum.
That isn’t what you will get from Skijumptoes! They don’t do that!


Darmok and Jalad on the ocean.


I’m currently writing code to calculate the text displays based on mOnProcessValue. It’s very specific to each mapping, but I have figured out all but the frequency mapping (it’s some sort of logarithmic mapping, I believe) for the equalizer parameters. I’m doing this because mOnDisplayValueChange isn’t always properly called when changes are made from the GUI. Calling mOnValueChange seems to trigger mOnDisplayValueChange, and it works, but as soon as I do that, my subPages stop working. So for now I’m brute-forcing everything with mOnProcessValueChange.
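For the frequency mapping, one plausible guess (an assumption, not anything confirmed by Steinberg) is an exponential curve over the usual 20 Hz–20 kHz EQ range, i.e. equal normalized steps give equal frequency ratios. A sketch to compare against the GUI readout:

```javascript
// Hypothetical log-frequency mapping: normalized 0..1 -> 20 Hz .. 20 kHz.
// The exact curve Cubase uses is undocumented; these constants are guesses.
var FREQ_MIN = 20;
var FREQ_MAX = 20000;

function normalizedToFreq(v) {
  // Equal ratios per step: v = 0.5 lands on the geometric mean (~632 Hz).
  return FREQ_MIN * Math.pow(FREQ_MAX / FREQ_MIN, v);
}

function freqToNormalized(freq) {
  // Inverse mapping, useful for checking values read from the GUI.
  return Math.log(freq / FREQ_MIN) / Math.log(FREQ_MAX / FREQ_MIN);
}
```

If the frequencies you log from mOnProcessValueChange don’t line up with this, the real curve may use a different range or an added warp factor.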

I’ll take a look at your code and see if I can come up with other ideas.

As far as VSTs go, I don’t see the hybrid approach as a real solution, especially for the Mackie C4, where you need the script interaction to properly update the displays. I’m hoping that is just temporary and that they come up with a real solution.


See above: the mOnProcessValue event tells you the value of the device when it changes, or the last value the device sent when the value in the DAW changes. I don’t think this is what you want.

mOnValue will tell you the value from the DAW, but it also reduces performance significantly, to the point that the DAW will stutter, so this isn’t a solution either.
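When the expensive part is the work done inside the callback (display formatting, SysEx building), one generic mitigation is to coalesce bursts of events and only process the latest value at a fixed rate. This is a plain-JavaScript sketch of that pattern, not a MIDI Remote API feature, and it won’t help if the lag comes from event delivery itself; the clock is injectable so the sketch can be tested outside Cubase.

```javascript
// Generic throttle/coalesce helper (sketch, not part of the MIDI Remote API).
// push() returns the value when enough time has passed, otherwise stores it;
// flush() hands back the newest skipped value so nothing is lost.
function makeThrottle(minIntervalMs, now) {
  now = now || function () { return Date.now(); };
  var last = -Infinity;
  var pending = null;
  return {
    push: function (value) {
      var t = now();
      if (t - last >= minIntervalMs) {
        last = t;
        pending = null;
        return value;       // process this one now
      }
      pending = value;      // too soon: remember only the latest
      return null;
    },
    flush: function () {
      var v = pending;
      pending = null;
      return v;             // newest skipped value, or null
    }
  };
}
```

A callback would then do its heavy work only when push() returns a non-null value, and a timer (or the next event) can flush() the remainder.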

mOnDisplayValue will give you the display value as text, but it is only sent under special conditions. The best I could come up with was knowing when the value on the device and the value in the DAW were different, but that doesn’t work unless you are moving the value in the DAW rather than clicking once. See the VariableControler class for that solution. I don’t think it is what you want, though. I tried to change the colour of the LEDs based on the DAW value, but found there was no performant way to do that.

I believe that the reason the subPages or Page events do not work for you is that the events are bound to the mapping. I made them also dependent on the Page by always updating the “Encoder” Class when the Page changes.

I was hoping for some of this to be fixed, but it doesn’t appear to have been.

This seems to be working for me. Here is the code I have so far. The value I’m calculating and the display values are in sync, but I have further testing and calculations to do.

    button.pushEncoder.mEncoderValue.mOnProcessValueChange = function (context, newValue) {
      var r = button.row.toString();
      var c = button.column.toString();
      var normalized = Math.round(newValue * 126.99).toString();
      console.log("ProcessValueChange(".concat(r, ",", c, ") = ", newValue.toString(), ",  ", normalized));
      var text = '';
      switch (button.row) {
        case 0:
        case 1:
          // Use VPotLedMode 1 for Gain
          // Calculate Gain: 0..1 -> -24.0..+24.0 dB
          text = ((Math.round(newValue * 480) / 10) - 24).toFixed(1);
          break;
        case 2:
          // Calculate Q Factor: 0..1 -> 0.0..12.0
          text = (Math.round(newValue * 120) / 10).toFixed(1);
          break;
        case 3:
          // Determine Filter Type
          switch (this.eqPage) {
            case 1:
              if      (newValue < 0.1428571492433548) {text = 'P I   |'}
              else if (newValue < 0.2857142984867096) {text = 'LS I  |'}
              else if (newValue < 0.4285714328289032) {text = 'HP I  |'}
              else if (newValue < 0.5714285969734192) {text = 'HP II |'}
              else if (newValue < 0.7142857313156128) {text = 'P II  |'}
              else if (newValue < 0.8571428656578064) {text = 'LS II |'}
              else if (newValue < 1)                  {text = 'LS III|'}
              else                                    {text = 'LS IV |'}
              break;
            case 2:
            case 3:
              if      (newValue == 0) {text = 'P I   |'}
              else                    {text = 'P II  |'}
              break;
            case 4:
              if      (newValue < 0.1428571492433548) {text = 'P I   |'}
              else if (newValue < 0.2857142984867096) {text = 'HS I  |'}
              else if (newValue < 0.4285714328289032) {text = 'LP I  |'}
              else if (newValue < 0.5714285969734192) {text = 'LP II |'}
              else if (newValue < 0.7142857313156128) {text = 'P II  |'}
              else if (newValue < 0.8571428656578064) {text = 'HS II |'}
              else if (newValue < 1)                  {text = 'HS III|'}
              else                                    {text = 'HS IV |'}
              break;
          }
          break;
      }
      // ...
    };

I do something similar on the “Track” page for the uppermost knobs on the LCXL. It is a bit more abstract, as it uses a generic Controller class to accomplish the same sort of thing.

Change the value with your mouse in the GUI and see what happens.
I tested the latest release and I still see the same behavior. The Display value updates ONLY when you change the track or when you change the device. It does NOT provide an event when the value of the target changes.

The Process value ALWAYS provides the last value sent from the device - even when the value has changed in the DAW.

So your code looks like it will work as long as you do not use the mouse to change the value, but it will not work if you change the track or change the value with the mouse.

So far I’ve been changing it in the mixer GUI, where I highlight the values with my mouse and type in a new value. This has worked perfectly. I have not tried it on the track itself. I will give it a go.


Seems to be working fine for me here - at least for the equalizer values. I change the frequency in the Inspector on the track page and mOnProcessValueChange responds accordingly; mOnDisplayValueChange does not, however.


Wow, really? I wonder why it isn’t working that way for me. This could be very good news.

So, if you open the channel strip and use the mouse to change a value, the mOnProcessValueChange newValue is provided with the value you set in the GUI on the channel strip?

My test was with moving knobs, though, and not with the slope type, so maybe that is the difference.