Request: Put the MIDI Remote API Online so AI can see it

Attn Devs,

Please put the Cubase 14 MIDI Remote API documentation on a publicly accessible website for one main reason: so that A.I. can access it and not give us bad advice when we’re trying to ask it questions regarding Cubase scripting.

Currently, because the API is local on our machines, AI has no way to access and update its knowledge on the libraries, functions, methods, and properties.
The other day, it gave me code that made no sense, because the property it gave me didn’t exist.

Thank you for your consideration! It’s almost a no-brainer, really.

It is available online: Introduction | MIDI REMOTE API

MR is not the only project with which chatGPT does this. It can literally do it everywhere. In fact it’s quite comical :smiley:

2 Likes

I have seen plenty of that with languages and code libraries that have publicly been available for years and years. AI can be helpful to some degree but as an automated code generator, it’s not quite ready for prime time.

1 Like

Depends on our aims. 30 minutes ago I watched a chess match between chatGPT and Meta bots and it was hilarious, I surely call this prime time: https://www.youtube.com/watch?v=XqD86JzRaIE

I had seen that github link when I searched for it, but it seemed like either it was too new or it wasn’t being read by AI properly.
Well well well…I think I may have trained CoPilot with all of my info I was feeding it.
A couple weeks ago, it didn’t know what the mSurface property was, inside DeviceDriver, or if it did, it didn’t tell me to use it in a test example when it should have.
But now, I’m asking it if it has access to the API and it’s saying yes - it pointed to the same link that m.c gave - and when I ask it to list all of the properties inside DeviceDriver, it’s giving me mSurface, as well as the others as it should.

So who knows - it seems to be improving.

I think the AI folks misnamed this stuff when they called it a hallucination.

It often feels more like interacting with a person who bullshits their way through stuff they mostly kinda know, but…

I try to point out that something is wrong but not tell the AI what exactly so it has to figure it out.

2 Likes

I think they’re like a child with lots of data available that they have yet to build upon :slight_smile:

1 Like

… except that children haven’t yet learned to bullshit to that extent.

It can be fun for a minute, until it starts going in circles and you realize it’ll be faster to just look it up the old-fashioned way.
I’ve tried using GPT a fair bit recently for coding, and I don’t think it presented me with fully functioning code once. What I did find useful was that it gave me some new ideas and angles of attack I wouldn’t have come up with as quickly on my own.
Give it a few years, though, and there’s no doubt in my mind that AI will cut a sizable chunk of software development off the market.

1 Like

If we ever get to that point, it won’t be much longer until it cuts a sizeable chunk off the planet.

You should mark the post with the link by @m.c as “solution”, so the AI can pick it up when the next person looks for that API document :wink:

1 Like

Better still, mark mine as the solution, so that AI will become cynical enough
that its logic will ultimately drive it to destroy all of humanity in the interests of its creators.

2 Likes

What might be cool is to build our own GPT model and train it purely for Steinberg-related purposes.
I’m currently in the process of learning JavaScript (for scripting) and Python (for AI purposes) but I’m still in the beginner stage.
If one of you were so inclined, however, and wanted to create a GPT Model that uses a Steinberg-only learning model, perhaps it could give us all the answers we need going forward.
As it stands, we have to use the other models that were trained on different data sets that are out of our control, and do not represent our needs specifically. I say “our” needs, but I selfishly mean my own needs, haha. :wink:

I’m currently building an NRPN-based SSL console controller for the StreamDeck+ and that is why I could use all the help I can get right now. The controller I’m making, I plan on releasing to the community when it’s done.

So far, I’ve gotten some icons made for it, the encoder icons, and the script works to turn the dB value in 0.1 dB increments, from -15 dB to +15 dB on the standard SSL dB knob (301 total steps, so NRPN must be used to get past the 128-step standard MIDI limit).
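Since NRPN is the way past the 128-step limit here, a minimal sketch of the idea: a 14-bit value is split across the standard CC pairs (CC 99/98 select the parameter number, CC 6/38 carry the data entry value). The dB-to-step mapping and the parameter number below are assumptions for illustration only, not the actual SSL or Trevliga Spel assignments.

```javascript
// Hypothetical sketch: map -15.0..+15.0 dB in 0.1 dB increments onto 0..300,
// then emit that 14-bit value as a four-message NRPN sequence.
function dbToNrpnStep(db) {
  return Math.round((db + 15) * 10); // -15.0 dB -> 0, +15.0 dB -> 300
}

function nrpnMessages(channel, paramNumber, value) {
  const status = 0xB0 | (channel & 0x0F); // Control Change status byte
  return [
    [status, 99, (paramNumber >> 7) & 0x7F], // NRPN param MSB (CC 99)
    [status, 98, paramNumber & 0x7F],        // NRPN param LSB (CC 98)
    [status, 6,  (value >> 7) & 0x7F],       // Data Entry MSB (CC 6)
    [status, 38, value & 0x7F],              // Data Entry LSB (CC 38)
  ];
}
```

For example, +15 dB maps to step 300, which goes out as data entry MSB 2 and LSB 44 (300 = 2 × 128 + 44), comfortably past the 128-step limit of a single CC.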

The part I need help with, and was hoping AI could help with, is making the knob velocity-sensitive, so that faster turns jump in larger increments than 0.1 dB and I don’t have to turn the knob so many times to reach higher values.
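One common way to get that velocity-sensitive feel is encoder acceleration: measure the time between incoming ticks and pick a larger step when they arrive close together. This is only a sketch of the general idea; the millisecond thresholds and step sizes are made-up numbers to tune by feel, and wiring it into the Stream Deck plugin or the MIDI Remote script is left out.

```javascript
// Hypothetical acceleration sketch: returns a dB step size per encoder tick
// based on how quickly ticks arrive. Thresholds and steps are placeholders.
function makeAccelerator() {
  let lastTickMs = Number.NEGATIVE_INFINITY; // so the first tick counts as slow
  return function stepForTick(nowMs) {
    const delta = nowMs - lastTickMs;
    lastTickMs = nowMs;
    if (delta < 20) return 1.0; // very fast turn: 1.0 dB per tick
    if (delta < 60) return 0.5; // fast turn: 0.5 dB per tick
    return 0.1;                 // slow turn: base 0.1 dB increment
  };
}
```

Usage would be something like calling `stepForTick(Date.now())` on each tick and adding the returned step (times the turn direction) to the current dB value, clamped to the -15..+15 range.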

I’m currently using the MIDI plugin by Trevliga Spel for the Stream Deck (amazing MIDI plugin), while also trying to make it function properly in Cubase with the MIDI Remote. So A.I. has been a huge help in piecing everything together, especially as a noob.

Edit: I will make another post about my specific question regarding my velocity-sensitivity needs after I’m done learning the API for the Stream Deck plugin (I have to make that script, plus the Cubase one).