MIDI Remote - Defects & Lessons Learned

Yes, he wrote this, but he also wrote this:

And since,
PluginOnValue
PluginBypassValue
PluginEditValue
are the only functions that make use of mOnDisplayValueChange in the API index, it means, as a result, that mOnDisplayValueChange is an unsupported feature (currently).
Can you agree with me on that point? Or am I totally dumb here?

Hello :joy:, a bit too fast, man!
mOnDisplayValueChange is available on all SurfaceValue-objects:

knob.mSurfaceValue.mOnDisplayValueChange
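For readers following along, here is a minimal runnable sketch of the callback pattern being discussed. The host-side SurfaceValue object is simulated (`makeFakeSurfaceValue` and `simulateHostUpdate` are stand-ins invented for this illustration, not part of the API); in a real script the knob comes from the surface factory methods and Cubase itself fires the callback:

```javascript
// Simulated stand-in for a SurfaceValue host object.
// NOTE: makeFakeSurfaceValue/simulateHostUpdate are invented for this demo;
// only the mOnDisplayValueChange assignment mirrors the example scripts.
function makeFakeSurfaceValue() {
  return {
    mOnDisplayValueChange: null,
    // The host would invoke the callback when the mapped parameter's
    // display text changes (e.g. "-6.0 dB").
    simulateHostUpdate: function (activeDevice, displayText) {
      if (this.mOnDisplayValueChange) {
        this.mOnDisplayValueChange(activeDevice, displayText);
      }
    }
  };
}

var knob = { mSurfaceValue: makeFakeSurfaceValue() };
var lastShown = "";

// The pattern from the example scripts: react to display-text changes,
// e.g. to forward the text to a hardware display.
knob.mSurfaceValue.mOnDisplayValueChange = function (activeDevice, value) {
  lastShown = value;
};

knob.mSurfaceValue.simulateHostUpdate({}, "-6.0 dB");
```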

That might be true, but if you do a search for mOnDisplayValueChange in the API index, you will only get results for what I wrote previously.
It is true that I can find mOnDisplayValueChange in the example scripts, but it is not documented anywhere in the API index except the entries I mentioned.

Please do a search for mSurfaceValue too in the API index :grin: :joy:
Tell us the result. :grin:

Not funny :face_with_monocle:

1 Like

No, it is not, I am afraid.

Hi,

The bug is described very nicely here.

I will not say anything more here and will just STFU, if you can confirm that this really is an API bug. It could also be a bug in how we use this function. And to be honest, I am not 100% sure about this. Are you?

Nothing towards your person at all.

The point being that, if the MIDI Remote were based on MCU, you’d expect it to be able to actually emulate or run as an MCU. But it can’t.

i.e. it’s not able to page through parameters, select specific insert slots, or do any number of things that the MCU protocol allows for. There’s not even a standard jog control; you have to create it yourself via triggers, defining forward and reverse actions based on the direction turned.
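To illustrate the kind of work that hand-rolling a jog control involves, here is a small self-contained sketch (plain ES5, no MIDI Remote host objects) of the usual first step: decoding a relative jog-wheel CC into a signed step. The 1..63 forward / 65..127 reverse convention shown here is a common hardware encoding, not something the API prescribes; your controller may differ:

```javascript
// Decode a relative (two's-complement style) jog CC into a signed tick count.
//   1..63   -> clockwise by that many ticks
//   65..127 -> counter-clockwise by (128 - value) ticks
//   0 and 64 -> no movement
function jogDelta(ccValue) {
  if (ccValue >= 1 && ccValue <= 63) return ccValue;         // forward
  if (ccValue >= 65 && ccValue <= 127) return ccValue - 128; // reverse
  return 0;
}
```

In a MIDI Remote script you would then fire a “forward” or “reverse” trigger depending on the sign of the result.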

I appreciate that you don’t accept what you’re being told, but I can only answer truthfully to what you’re asking.

The other thing that SB don’t want to openly accept (due to it being a paid feature) is that the MIDI Remote is almost in a public beta stage right now. So there will be oddities that don’t make sense, fixes required, functions missing, documentation errors, example errors - which is clearly frustrating you, as it is a lot of us.

But that’s a whole other topic to whether it’s based on MCU or not.

4 Likes

Hi skijumptoes :slight_smile:

This is a relief to me. I guess you are a native English speaker. If so, please avoid such phrases and metaphors. This is a multi-language community and I am not familiar with such phrases. Since there was no answer from you yesterday, it was the icing on the cake and really tilted me hard.
So again, in your own interest, avoid things like these.

That was a lesson I learned yesterday. I made a mistake in mentioning MCU and revised my statement. If you just replace MCU with “the vendor/company that was first to have a display, with this display supported by Cubase”, my previous posts would make more sense.
I don’t care which company it is. My main point was to explain that there are many things where Steinberg doesn’t need to reinvent the wheel, as it was all done decades ago. It would be dumb not to use these resources and to build the API completely anew instead.
That is why I wrote “based on” and not “tied to” MCU. Sadly, every other person yesterday jumped on the MCU wagon and not on the thing I was trying to explain.

That’s why I wrote that the API is capable of doing the same things as the MCU, but not all of them. Some stuff is missing, but not too much.

You all said yesterday that the MCU (omg, again) uses standard MIDI communication. For this communication, the “hooks” into Cubase were made a long time ago. Why would you create new “hooks”??? I would use these “hooks” from the past and put them one by one into the new API, because the API is not different from standard MIDI communication. The API index is nothing more than these “hooks”. That is why I see so many, many things similar to the MCU (or other vendors).

It is just sad that the API index is documented like $hit in the first place. That only leads to heated discussions where people split hairs over whether the API has similarities to the MCU (or another vendor).
Because of the poor documentation, people start to brew their own soup with their own scripts, resulting in code that reads differently EVERY time, for EVERY script.

It is very easy to imagine how that would look if the documentation had been readable and nicely explained, instead of putting out something that looks like the current state of the API.
ALL scripts would look somewhat similar (you don’t reinvent the wheel if you are smart).
Sadly, that opportunity is gone, and it leaves “normal” people far behind, as they can’t do anything on their own with the API except poke at it a little and spend precious time on trivial things.

Sadly, this pretty dumb approach splits the community into people that are frustrated (like me) and hyper-active fanboys that jump in and tell us (frustrated) people how wonderfully the API is working on their cheap-$hit controllers with absolute encoders (probably not even having a display). This is the point where I say to myself: “Hey fanboy, you could do this decades ago with the Generic Remote for your cheap-$hit controller. You simply don’t need a new API to do that. So piss off.”

Sadly, the hype around the API is worth more than the crying of people that can’t get a simple LED to work, even after months. Dude, an LED… months… I’m puking. How can I tolerate that hype? How is it possible that a hyped API does not offer fundamental things like these?? No, you need to write it yourself. SAD.
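For what it’s worth, here is roughly what hand-rolling that LED feedback looks like, reduced to a runnable sketch. `sendMidi` is simulated here so the snippet runs anywhere; in an actual script you would hand the bytes to the device’s MIDI output object inside the relevant value-change callback:

```javascript
var sentMessages = [];

// Simulated output collector; a real script would send these bytes to the
// controller's MIDI output instead.
function sendMidi(bytes) { sentMessages.push(bytes); }

// Build a 3-byte Note On for a pad LED on MIDI channel 1 (status 0x90).
function ledMessage(note, on) {
  return [0x90, note & 0x7f, on ? 127 : 0];
}

// e.g. a mute button's LED on note 16, lit when the mapped value is "on".
function onValueChange(note, normalizedValue) {
  sendMidi(ledMessage(note, normalizedValue > 0.5));
}

onValueChange(16, 1.0); // lights the LED
onValueChange(16, 0.0); // turns it off
```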

As long as you can create a jog control with JavaScript API code, you cannot say it is not possible; it just needs work to do so. SAD.

It’s an English language community and you should make an effort.

3 Likes

This is a forum in English. If you’re not familiar with certain words or phrases, look them up. You can’t ask other users to adapt their language based on your level of understanding. That is ludicrous.
Also, Mr. Skijumptoes did not say anything offensive.

1 Like

English is hard. I’ve made an effort to learn 7 languages in my life, and even though English is my first language, it is by far the most difficult.

@u-man, if you need help with code, then post it. Fork the GitHub repo, push your code and post the link. Many on this forum will be willing to help.

I needed help at first and got a lot of help from many kind people here on the forum, including Jochen, Martin, Skijumptoes, Nico5, MarcoE et al.

Post your code, and let us figure it out together.

That’s kinda my main issue with where the MIDI Remote is right now. It’s fantastic at more immediate ‘friendly’ mappings. But as soon as you want to delve a little deeper you have to jump into the API.

Now, I always imagined the API to be a vendor-facing (manufacturer-facing) environment, so I expected there to at least be a full array of existing functions from the MCU and/or the Generic Remote right off the bat - because that is what most generic devices are using currently.

Placing a message in the Generic Remote sections that it’s going to be discontinued and recommending the MIDI Remote as an alternative - ok. But when the MR is not able to perform those previous tasks, SB are then pushing people down a vendor API route, so no wonder there’s confusion.

It feels to me that this needs another year of beta testing; the documentation and implementation are messy, and most of it simply bolts on top of what is already there - hence the restrictions already in place. Perhaps this ‘bolting on’ to what’s there already is what you term ‘based on MCU’? If so, I understand the point you’re making.

We’re still left with this restrictive 8-parameter system of VST Quick Controls, Track Quick Controls and now Focused Quick Controls - the repeated use of ‘Quick Controls’ across multiple elements is just weird, and the refusal to allow us any kind of paging system within them is just maddening.

It should’ve been written from the ground up; then it could’ve included the ability to manipulate events, MIDI note data and colour coding from instruments such as Groove Agent, so that devs could utilise that for pad and key RGB - and also mapping to plugins at a class level, not per slot/track.

In fact, that’s what I was expecting it to be: not tied to current control layers, but a direct route to DAW events and plugins.

Also, as many have called out, the API is based on the ES5 revision of JavaScript, which is quite antiquated now, and I think that only adds to the frustration when you consider that it already feels old and backwards to code within.

I was very happy with the immediacy of creating a script, and with the mapping assistant within Cubase itself, but it feels that getting a visual representation of the controller in Cubase was a higher priority than actually considering finer-detailed elements such as multi-page mappings, focusing/opening insert slots, mapping single-CC jog wheels, encoder scaling etc. - most of which we were doing beforehand with Generic Remote/MCU devices.
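Encoder scaling, one of the finer-detailed elements mentioned above, is another thing the script writer currently has to build. A minimal sketch in plain ES5 (`makeScaledEncoder` is a name invented for illustration, not an API call): accumulate signed ticks into a clamped 0..1 value, with the sensitivity deciding how many ticks cover full travel:

```javascript
// Invented helper for illustration: a scaled, clamped encoder accumulator.
function makeScaledEncoder(sensitivity) {
  var value = 0; // normalized 0..1, what you would feed to a parameter binding
  return {
    step: function (signedTicks) {
      value = Math.min(1, Math.max(0, value + signedTicks * sensitivity));
      return value;
    },
    get: function () { return value; }
  };
}

// With sensitivity 0.05, twenty forward ticks cover the full 0..1 range.
var enc = makeScaledEncoder(0.05);
enc.step(10); // roughly half travel
```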

Maybe there’s more expansion to come in the future with MIDI 2.0 + VST3 only, but in terms of genuine improvement, I feel that they’ve gone down the wrong road here - through fear of disrupting what was there already, I expect.

But sometimes you just gotta be brave, and believe in the project and make it better than what you’re ripping out. As I see it, there’s just more being piled on top. (IMO of course!)

2 Likes

I agree. I will say this though: as a seasoned computer scientist who has often been afforded the ability to choose my projects, it reminds me a lot of the sort of project that I would jump into with the same intention, only to learn that the reality of the legacy code presented unforeseen challenges.

That is why I now fully embrace the idea of Evolutionary Design. What we have now is truly amazing. I suspect that a lot of hard decisions were made in order to include the MR in the version 12 release. There is plenty of room for improvement, but other than a few minor inconveniences, I was able to convert my GR mappings (which took days to create) into UI-created MR mappings in minutes.

If the MR environment continues to evolve, provides an ES6 API, adds the features from GR that are missing, provides two-way event handling that does not create lag in the DAW, allows MIDI to be sent back to the device as chosen by the script writer, provides a pass-through mechanism to allow for direct CCs (or even notes) on one page but mapped controls on another, and allows for paging of Focused Quick Controls, among other things, I am sure we will still find more to be desired.

If the MR features fall by the wayside, go a decade with no changes, and are then replaced by something new, that would be very disappointing. But that doesn’t seem to be what is happening. I only wish I could dig deeper into the code to see what is happening underneath and help the thing evolve. That’s a good thing, because it means that I am inspired.

Now if I could only get inspired to finish some of these tracks…

2 Likes

You got it! Yes, if that’s the roadmap, brilliant. Plus whatever MIDI 2.0 brings, of course. :slight_smile:

I think what we have today is pretty much it, though. I can’t see how they’re going to bring specific parameter elements into the API for us to bind to, or the selected-track elements from GR, such as channel strip elements, even when they change slot/position.

There’s also no appreciation of why users would want to page through quick controls, which is concerning.

Unless the plan is to create the hardware layer via the API, then add the rest via the mapping assistant, exporting that out as the final script?

If that is the expected method, then it’s me that’s completely missed the boat on this one! As I’ve been trying to map as much as possible within the API, I need to turn my thinking upside down and only use it for defining the hardware.

But then, there are elements you have to bind in the API to be able to translate the expected values correctly. And once you modify that hardware control, it then becomes non-generic in the mapping assistant. :confused:

2 Likes

I did. It did not help much. I still had, and have, issues understanding the whole sentence, and the only feeling I had was that it did not sound nice to me. That’s why I wrote that I don’t understand it.
If there is then no answer (from him), and I already had a bad feeling about it, how would you feel?

Is it really important to you to read phrases (like these) that raise more questions than they answer in something already as complex as discussing API functions???
For me it is not important. It does not contribute anything; quite the opposite… it’s irrelevant.
If you really want that kind of stuff… no problem, take the L.

What is the problem with you all? I am familiar enough to talk about API stuff, and I gladly look up things that I don’t understand, but phrases like these have no point if people (like me) will not understand them as a whole or can’t find a clear definition for them. Also, see above - take the L too.

Neither of you could explain it either, but you have the balls and nerves to write such a comment? Seriously?

Dude. I don’t even know what phrase you are referring to. Is it “the elephant in the room”?

My point is, you can’t tell people to adapt their language because you can’t understand certain things.
If there was something in a post you didn’t comprehend, why not just ask what they meant?


:laughing:

2 Likes

I’m hoping this is wrong, but that is what I am trying to do with VSTis and the most recent changes. You have to know what you are going to map to, but as you state, anything interesting requires the mapping to be known in the code.

I still don’t see the mOnValueChange issue fixed either. It isn’t a solution because it causes extreme lag in the DAW. You can’t use it to move faders or change LEDs.

It may work for @Ron.Garrison though, but he would need to add page specificity to the event handling. (Ron take a look at how I did that here: https://github.com/oqion/midiremote-userscripts/blob/main/Novation/Launch_Control_XL/Novation_Launch_Control_XL.ts)

The way I see it, the first thing to solve is that everything is MVC-like but looking in one direction only, with the “Surface” as the View. From the script writer’s perspective, though, the View is the device. This happens a lot with real-time telemetry-based systems like flight control or mission control. And it isn’t even just UI-related; it also affects what is “on-thread” and what is “scheduled” differently as well.

But that stuff is probably easy enough to resolve - just a lot more work to add a thread pool or whatever, going in what would seem like the wrong direction. The really hard stuff, I suspect, is buried in where to put the hooks into the legacy code.

Of course, without looking at the real code, I’m completely clueless.

1 Like

Oh man, I didn’t mean you were the elephant, @u-man.

The elephant is that the MR isn’t able to match the abilities of the MCU, which is the bigger issue being missed in that conversation about whether it was ‘based’ on MCU.

If that still doesn’t make sense I’ll go give myself a good slap, then you can feel good again! Your English is great, by the way. :+1:

1 Like

No one is trying to not be nice. You are a welcome and valued member of this forum.
That isn’t what you will get from Skijumptoes! They don’t do that!

Edit:

Darmok and Jalad on the ocean.

2 Likes