Full Scripting API for Cubase - The AI Integration Gap Is Now a Competitive Threat

No. What took place here was a substantive discussion about AI, the necessary interfaces for it, and the arguments for and against it from the perspective of users and manufacturers. People don’t have to agree on everything.

And if you think Steinberg can develop all useful AI features that could meaningfully assist composers by themselves, without opening up to third parties, then you’re completely delusional.

I didn’t say that. Just read what I wrote again carefully. When I say that Steinberg should retain control, that doesn’t mean that third parties should be completely shut out.

I’d personally rather see smart UI features. If I were looking for an AI-integrated DAW, I don’t think I would look to Cubase…

Good idea to open up the available features, though.

cheers

Why the hostile approach?

Huh? Sorry, I told you very politely that you should read the top part again, and now what are you accusing me of?

Well, I was the first one to reply in this topic, so apparently I have read the original post - and you can see that if you do what you asked me to do. Of course I take offence.

@Johnny_Moneto

I read the topic and replied to it with:
In response to the original point raised in this thread, I’d like to share my perspective…

You wrote in response:
I don’t see how your post has anything to do with the topic. *)

So you’re implying that I missed the point in my reply - so the same goes for me: of course I take offence.

What else am I supposed to say, other than that you should just compare the two again?

*) If you’d just kept that comment to yourself, it would’ve been better. Comments like that always end up cluttering everything up. Important: I didn’t mean that in a mean way, okay. :wink:

@Glorious

Referring to my post, what do you think of what Bitwig has already started - namely, introducing a standard for DAW projects? It can’t be that bad, since Steinberg has already jumped on board, can it?
Do you see potential here for integrating an AI interface with this format?

Here is the relevant section (DAWproject) from the Cubase Pro 15 online help:

I also asked you to clarify it for me. I don’t rule out that I’m the only one who isn’t seeing the connection between your post and the topic.
For me the topic is about using AI as an assistant operator. You talk about DAWproject, which is a project exchange format and has nothing to do with operating a DAW.

I just don’t get it.

2 Likes

I didn’t say that DAWproject has anything to do with operating a DAW.
I said: “Glorious, do you see potential here for integrating an AI interface with this format?”

That doesn’t mean, of course, that Glorious can use this to carry out his project.
Glorious mentioned the kind of data his AI would need, and in that regard, I see some overlap with the DAWproject format. This is significant insofar as it suggests that, in addition to Bitwig and Cubase, other DAWs might be capable of handling this as well. That might turn out to be the case, or it might not - we’ll see…
Of course, to enable interaction between external AI and the DAW, further steps (API functions, etc.) would still be needed. We’re not there yet, but to get there, we first need a standardized data format so that an external AI can work with any DAW. We need a non-native data format that has the potential to do that. Data standards are the best way to ensure that APIs are implemented - in other words, to motivate developers to do so.

“What does that have to do with the topic?” you might be asking again.
I think it’s part of any discussion to consider all aspects in order to assess whether we’re at least heading in the right direction, even if we haven’t reached the goal (= any AI for any DAW) yet.

1 Like

I think you are completely mistaken with that train of thought. DAWproject is a project exchange format and I think we do not require a standard project file format at all.
There already is a solution for Reaper and another one for Live, as per @Glorious’ post. Live does not support DAWproject, and the two DAWs have no significant common standard, yet there are solutions for both of them.

1 Like

@Johnny_Moneto

DAWproject is a project exchange format

There are no specific rules regarding how to use a format or how it might be used differently in the future.

we do not require a standard project file format at all.

Of course we need that, and Steinberg agrees.

There already is a solution for Reaper and another one for Live

So if one DAW can do something, does that mean other DAWs don’t need to be able to do it too? Surely no one would agree with that.

We do not require a standard project file format at all for what is proposed in this topic.

Your reply has nothing to do with my line of argumentation and therefore I won’t discuss it here.

Right, it could become a standard on how to grow bananas.

You simply repeat your opening statement or thought. You haven’t backed it up with any point, despite that being what I asked of you.
I respect you for your work on midi remote scripts elsewhere in this forum but in this case I think you are just off topic and derailing this thread.

1 Like

We do not require a standard project file format at all for what is proposed in this topic.

Now you’re contradicting yourself — and contradicting Steinberg as well.

Your reply has nothing to do with my line of argumentation and therefore I won’t discuss it here.

That wasn’t an argument, but rather your way of insisting on defining exactly how far it’s permissible to expand on the topic. I’ve noticed your condescending attitude in this forum before, and also that in similar situations it’s mainly your stubbornness that leads to digressions from the topic. And then you end up spouting nonsense like

Right, it could become a standard on how to grow bananas

And on this subject:

You simply repeat your opening statement or thought. You haven’t backed it up with any point, despite that being what I asked of you.

No, I haven’t repeated myself at all; rather, I’ve gone out of my way for your sake to explain my reasoning to you in simpler terms.

I respect you for your work on midi remote scripts elsewhere in this forum

Thanks

but in this case I think you are just off topic and derailing this thread.

The same goes for other posts here. But it seems you only criticize this when it comes to users you specifically target. I’ve noticed that quite often here in the forum.

But that doesn’t really matter, since I can see that many of your posts on the forum provide valuable input for everyone. I really respect that.
Given this mutual respect and understanding - since we clearly have different views on when we’ve gone off-topic and when we haven’t - we should put an end to this back-and-forth now.

1 Like

I think what you’ve accomplished is really interesting!
Please let us know more about it once you’ve made further progress.

1 Like

As long as there’s an AI that can tell when music has been generated by AI, there’s hope.
I think I’m still capable of doing that myself, but I’m a little afraid that one day an AI might be better than me.

I think it’s silly to claim that an API, which Steinberg has named “MIDI Remote,” is limited simply because it only supports communication via MIDI. That sounds like saying a snare drum is limited because you can’t play a melody with it.
:laughing:

Steinberg thinks that’s so important that DAWproject import/export is available even in the LE and AI versions of Cubase 15, whereas it was an Artist/Pro-only feature in 14.

MIDI can of course transport any data, but there are better ways to transport data, and the handling of MIDI in this context is genuinely poor. Real MIDI is 31.25 kbit/s serial signalling, which requires good flow control and priority handling. But Steinberg is not relying on real MIDI: they assume everyone transports MIDI over USB, which can move it much faster and better than real MIDI. And if you require USB anyway, there are much better ways to handle communication than MIDI.
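
For scale, here is a quick back-of-the-envelope sketch of what that bandwidth means in practice. The MIDI 1.0 DIN rate is nominally 31.25 kbit/s, and each serial byte costs 10 bits on the wire (start bit, 8 data bits, stop bit):

```python
# Back-of-the-envelope math for classic DIN MIDI throughput.

BAUD = 31_250          # bits per second on a 5-pin DIN MIDI cable (MIDI 1.0)
BITS_PER_BYTE = 10     # 1 start bit + 8 data bits + 1 stop bit

bytes_per_second = BAUD / BITS_PER_BYTE          # raw byte throughput
cc_message_ms = 3 / bytes_per_second * 1000      # time for a 3-byte CC message

print(f"{bytes_per_second:.0f} bytes/s")         # 3125 bytes/s
print(f"{cc_message_ms:.2f} ms per CC message")  # 0.96 ms
```

Roughly 3 KB/s: fine for note events, painfully slow for shuttling structured state around.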

3 Likes

But it wasn’t an “extensive research into the MIDI Remote API”, because it was only thanks to m.c.'s helpful guidance that Glorious came to understand what the MIDI Remote API is actually capable of. It’s a bit strange that Glorious got so many likes for his incomplete research.

Still, thanks to him for his work. It helps me understand why Reaper is so important today.
:+1:

1 Like

I’ve been reading every reply but held off responding to let the discussion develop and accumulate enough substance for a single, thorough response. There’s a lot worth addressing here.


First: Correcting the Record

@butterfly is right to call this out, and I want to be straightforward about it: my original post overstated the limitations of the MIDI Remote API. It was @m.c who showed me — through actual working code — that Direct Access can do significantly more than I documented. The phrase “extensive research” in my original post was accurate regarding the competitive landscape (REAPER, Ableton, MCP ecosystem), but clearly insufficient regarding what Cubase’s own API can actually do today. That’s a fair criticism, and I own it.

I should have spent more time in the Steinberg developer forums and less time reading REAPER’s documentation before drawing conclusions. Lesson learned.


What m.c’s Code Actually Changes

@m.c’s post #16 is the most technically valuable contribution in this thread, and I want to make sure it doesn’t get lost in the noise. He demonstrated that through Direct Access you can:

  • Read parameter names on any third-party VST via getParameterTitle()

  • Read formatted/human-readable values (e.g., “2.4 kHz”) via getParameterDisplayValue()

  • Iterate all parameters of any plugin using getParameterTagByIndex()

  • Iterate all tracks with their properties (name, volume, mute, solo, routing, metering) via tag IDs

He also shared the full tag reference table for track properties (40 tags covering everything from automation state to channel configuration) and a utility for logging Direct Access objects via SysEx to an HTML viewer.

This is significant. It means plugin parameter access in Cubase is much closer to REAPER’s TrackFX_GetParamName / TrackFX_GetFormattedParamValue than I originally claimed. For my AI mixing assistant, this would cover reading EQ settings, compressor thresholds, and most plugin state — which was one of the seven requirements I listed.

Updated scorecard with m.c’s corrections:

| Requirement | Original Assessment | Corrected Assessment |
| --- | --- | --- |
| 1. Read session state (tracks, FX chains) | :cross_mark: | :white_check_mark: Possible via Direct Access (since CB13.0.50) |
| 2. Read plugin parameters (names + values) | :cross_mark: (said “8 via QC”) | :white_check_mark: All params via Direct Access |
| 3. Set plugin parameters | Partial | :white_check_mark: Via Direct Access |
| 4. Insert/remove plugins | :cross_mark: | :warning: Plugin selection doable in CB15 via DA (m.c promised snippet) |
| 5. Create tracks and buses | :cross_mark: | :cross_mark: Still not possible |
| 6. Render audio regions | :cross_mark: | :cross_mark: Still not possible |
| 7. External process communication | :cross_mark: (MIDI only) | :cross_mark: Still sandboxed, MIDI only |

Three of seven are now confirmed possible. That’s meaningful progress from what I originally presented. But requirements 5, 6, and 7 remain hard blockers — and those are the ones that make the difference between “scriptable mixer” and “automatable DAW.”


The Sandbox Is the Real Constraint

@butterfly joked that complaining about the MIDI Remote API only supporting MIDI is like saying a snare drum can’t play melody. That’s a fun analogy — but actually, a snare drum has a fundamental pitch and a harmonic series. Load a snare sample into a sampler that can transpose, and you can play any melody you want with a snare tone. The instrument’s limitation was never physical — it was the interface. Give it a better interface (a sampler with a keyboard), and it becomes melodic.

That’s exactly the point of this thread. Cubase’s internal capabilities are rich — m.c proved that. The limitation is the interface we’re given to access them. Give us a better interface, and Cubase becomes programmable.

@cubace made the key technical point in #60: MIDI as a data transport protocol is fundamentally limiting for programmatic control. It was designed for note data at 31.25 kbit/s. Even over USB, wrapping structured commands (create track, set parameter by name, query project state) into MIDI SysEx or CC messages is an awkward abstraction. HTTP/JSON, WebSocket, IPC, or even simple stdin/stdout would be orders of magnitude more natural for this use case.
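
To make the awkwardness concrete, here is a minimal sketch of what tunneling a structured command through SysEx would look like. Everything here is hypothetical: 0x7D is the reserved non-commercial manufacturer ID, and the command schema is invented for illustration, not anything Cubase actually implements.

```python
import json

SYSEX_START, SYSEX_END = 0xF0, 0xF7
NON_COMMERCIAL_ID = 0x7D  # manufacturer ID reserved for private/experimental use

def encode_command(cmd: dict) -> bytes:
    """Wrap a JSON command in a SysEx frame (hypothetical protocol)."""
    payload = json.dumps(cmd, separators=(",", ":")).encode("ascii")
    # SysEx data bytes must have the high bit clear; ASCII JSON already does.
    assert all(b < 0x80 for b in payload)
    return bytes([SYSEX_START, NON_COMMERCIAL_ID]) + payload + bytes([SYSEX_END])

def decode_command(frame: bytes) -> dict:
    """Unwrap and parse a frame produced by encode_command."""
    assert frame[0] == SYSEX_START and frame[-1] == SYSEX_END
    return json.loads(frame[2:-1].decode("ascii"))

frame = encode_command({"op": "setParam", "track": 3, "value": 0.5})
assert decode_command(frame) == {"op": "setParam", "track": 3, "value": 0.5}
```

It works, but every layer here (7-bit packing, framing, manual parsing) is pure overhead that a WebSocket carrying the same JSON would make disappear.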

@David_Nuttall demonstrated this problem concretely with his hybrid stack — virtual MIDI + SendKey + interpreted scripting + HTTP/WebSocket service — all built to work around a sandbox that doesn’t need to exist for programmatic use cases.


On DAWproject and Standards

@CKB raised an interesting point about DAWproject as a potential foundation. While I understand the logic — DAWproject defines a structured format for tracks, plugins, routing, and media — I agree with @Johnny_Moneto that it’s solving a different problem. DAWproject is a static exchange format (save → transfer → load). An AI assistant needs a live, bidirectional API (query state → analyze → modify → re-query). They’re complementary, not overlapping.

That said, if Steinberg ever builds a proper API, the data model they’ve already defined for DAWproject could inform the API’s object structure. The ontology exists — it’s the interface that’s missing.


On MickeyB’s Project File Analysis

@CKB highlighted MickeyB’s approach of parsing Cubase .cpr project files externally with Claude to generate plugin inventories and routing maps. This is creative and useful for static analysis (auditing projects without opening them, tracking plugin usage across projects). It won’t work for live mixing assistance (you need real-time state, not a snapshot from disk), but it proves there’s demand for programmatic access to project data — and people will find a way to get it, even if it means reverse-engineering binary file formats.
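
For readers curious what that kind of static analysis can look like, here is a deliberately naive sketch. The real .cpr format is undocumented binary; this just scans for printable ASCII runs and matches them against a hypothetical list of known plugin names. A heuristic, not a parser:

```python
import re

# Hypothetical plugin list for illustration; in practice you'd scan a real
# VST catalog. The .cpr bytes below are also fabricated test data.
KNOWN_PLUGINS = {"FabFilter Pro-Q 3", "Frequency", "Compressor"}

def ascii_strings(blob: bytes, min_len: int = 4):
    """Extract printable-ASCII runs of at least min_len characters."""
    return [m.group().decode() for m in re.finditer(rb"[ -~]{%d,}" % min_len, blob)]

def plugin_inventory(blob: bytes):
    """Report which known plugin names appear as strings in the binary blob."""
    found = set()
    for s in ascii_strings(blob):
        for name in KNOWN_PLUGINS:
            if name in s:
                found.add(name)
    return sorted(found)

fake_cpr = b"\x00\x01ROOT\x00trackFrequency\x00\xff\x10Compressor\x00junk"
print(plugin_inventory(fake_cpr))  # ['Compressor', 'Frequency']
```

Crude as it is, this is enough for auditing which plugins a pile of old projects depends on, which is exactly the static use case described above.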


On “Steinberg Should Control AI Features Themselves”

Several people suggested that Steinberg should build AI features internally rather than opening an API. I respect this perspective, and @awesomeaudio articulated it well. But @MattiasNYC nailed the counterargument:

If you open everything up you have 1 Steinberg + X third party developers. So in practice it would probably mean more new features over any given time period.

And @djpat’s VST analogy is perfect: Steinberg invented VST. They didn’t try to build every synthesizer and effect themselves — they created a standard and let the world build on it. That decision made Cubase the center of a massive plugin ecosystem. An API would do the same for automation, scripting, and AI tools.

The competitive pressure is real. @MattiasNYC put it bluntly: Steinberg is “sandwiched between the cheaper alternative on one end and the professional industry standard on the other.” REAPER has had scripting since its early days. Pro Tools is expanding its ecosystem. Cubase’s competitive advantage has always been workflow quality — but workflow is exactly what scripting and AI tools enhance.


What I’m Doing Next

I’m continuing to build the REAPER prototype. The architecture is “tools, not workflows” — a thin orchestrator that gives the LLM 30+ tools (analyze spectrum, get masking, set EQ parameter, screenshot plugin GUI) and lets it decide how to solve problems. The model sees the mix, proposes changes, the engineer approves, and the system re-analyzes for cascading effects.
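
To illustrate the “tools, not workflows” idea, here is a minimal sketch of a tool registry and dispatcher. The tool names, the session structure, and the call format are invented placeholders for illustration, not the actual prototype’s API:

```python
from typing import Callable, Dict

TOOLS: Dict[str, Callable[..., object]] = {}

def tool(name: str):
    """Decorator that registers a function as a model-callable tool."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("get_track_names")
def get_track_names(session):
    return [t["name"] for t in session["tracks"]]

@tool("set_eq_gain")
def set_eq_gain(session, track, band, gain_db):
    for t in session["tracks"]:
        if t["name"] == track:
            t["eq"][band] = gain_db
    return f"{track} band {band} -> {gain_db} dB"

def dispatch(call, session):
    # The model emits {"tool": ..., "args": {...}}; route it to the function.
    return TOOLS[call["tool"]](session, **call["args"])

session = {"tracks": [{"name": "Vocal", "eq": {1: 0.0}}]}
result = dispatch({"tool": "set_eq_gain",
                   "args": {"track": "Vocal", "band": 1, "gain_db": -3.0}},
                  session)
print(result)  # Vocal band 1 -> -3.0 dB
```

The point of the pattern: the orchestrator stays thin and declarative, so adding a new capability is just registering another small function rather than scripting a workflow.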

For those asking for a demo (@CKB and others): I’ll share progress once there’s something worth showing. Expect a CLI MVP within a few weeks — nothing flashy, but functional enough to demonstrate the concept: natural language → audio analysis → proposed changes → apply → re-analyze.

But here’s the thing I want to emphasize: thanks to m.c’s work, the Cubase version is closer to feasible than I originally thought. Requirements 1-3 (read state, read params, set params) appear to be covered by Direct Access. The remaining blockers (track creation, rendering, external communication) are real, but a meaningful subset of mixing assistance could work within the current constraints — especially if someone builds a bridge between the sandboxed JS environment and an external process (even if that bridge is MIDI SysEx, ugly as it would be).

If m.c is willing to collaborate on exploring what’s possible within the current Direct Access capabilities, I’d be genuinely interested. The worst case is we document exactly where the real walls are. The best case is we find enough capability to build something useful.

Thanks to everyone who contributed substantively to this discussion — especially m.c for the code, @cubace for the NDA API insight, @MattiasNYC for the strategic framing, and @philmac for the real-world perspective. This is the kind of thread that, if Steinberg pays attention, could actually inform their roadmap.

2 Likes