I’ve been reading every reply but held off responding to let the discussion develop and accumulate enough substance for a single, thorough response. There’s a lot worth addressing here.
First: Correcting the Record
@butterfly is right to call this out, and I want to be straightforward about it: my original post overstated the limitations of the MIDI Remote API. It was @m.c who showed me — through actual working code — that Direct Access can do significantly more than I documented. The phrase “extensive research” in my original post was accurate regarding the competitive landscape (REAPER, Ableton, MCP ecosystem), but clearly insufficient regarding what Cubase’s own API can actually do today. That’s a fair criticism, and I own it.
I should have spent more time in the Steinberg developer forums and less time reading REAPER’s documentation before drawing conclusions. Lesson learned.
What m.c’s Code Actually Changes
@m.c’s post #16 is the most technically valuable contribution in this thread, and I want to make sure it doesn’t get lost in the noise. He demonstrated that through Direct Access you can:
- Read parameter names on any third-party VST via getParameterTitle()
- Read formatted/human-readable values (e.g., “2.4 kHz”) via getParameterDisplayValue()
- Iterate all parameters of any plugin using getParameterTagByIndex()
- Iterate all tracks with their properties (name, volume, mute, solo, routing, metering) via tag IDs
He also shared the full tag reference table for track properties (40 tags covering everything from automation state to channel configuration) and a utility for logging Direct Access objects via SysEx to an HTML viewer.
This is significant. It means plugin parameter access in Cubase is much closer to REAPER’s TrackFX_GetParamName / TrackFX_GetFormattedParamValue than I originally claimed. For my AI mixing assistant, this would cover reading EQ settings, compressor thresholds, and most plugin state — which was one of the seven requirements I listed.
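To make the comparison concrete, here is a minimal sketch of what parameter iteration via Direct Access could look like. The function names (getParameterTitle, getParameterDisplayValue, getParameterTagByIndex) come straight from m.c’s post, but the object shape, call signatures, and end-of-list sentinel are my assumptions — I’ve included a mock host object so the logic runs outside Cubase:

```javascript
// Sketch: iterate a plugin's parameters the way m.c's post #16 describes.
// The `da` (Direct Access) object here is a stand-in mock so the logic can
// run outside Cubase; the real call signatures are assumptions, not gospel.
function dumpPluginParams(da, pluginId) {
  const params = [];
  for (let i = 0; ; i++) {
    const tag = da.getParameterTagByIndex(pluginId, i); // name from m.c's post
    if (tag < 0) break; // assumed end-of-list sentinel
    params.push({
      tag,
      name: da.getParameterTitle(pluginId, tag),          // e.g. "Frequency"
      value: da.getParameterDisplayValue(pluginId, tag),  // e.g. "2.4 kHz"
    });
  }
  return params;
}

// Minimal mock standing in for the Cubase-side object:
const mockDA = {
  _params: [
    { name: 'Frequency', value: '2.4 kHz' },
    { name: 'Gain', value: '-3.0 dB' },
  ],
  getParameterTagByIndex(id, i) { return i < this._params.length ? i : -1; },
  getParameterTitle(id, tag) { return this._params[tag].name; },
  getParameterDisplayValue(id, tag) { return this._params[tag].value; },
};

console.log(dumpPluginParams(mockDA, 'insert-slot-1'));
```

Against the mock this yields name/value pairs for every parameter — exactly the shape REAPER’s TrackFX_GetParamName / TrackFX_GetFormattedParamValue pair gives you.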
Updated scorecard with m.c’s corrections:
| Requirement | Original Assessment | Corrected Assessment |
|---|---|---|
| 1. Read session state (tracks, FX chains) | Not possible | Possible via Direct Access (since CB13.0.50) |
| 2. Read plugin parameters (names + values) | Limited (said “8 via QC”) | All params via Direct Access |
| 3. Set plugin parameters | Partial | Possible via Direct Access |
| 4. Insert/remove plugins | Not possible | Plugin selection doable in CB15 via DA (m.c promised a snippet) |
| 5. Create tracks and buses | Not possible | Still not possible |
| 6. Render audio regions | Not possible | Still not possible |
| 7. External process communication | MIDI only | Still sandboxed, MIDI only |
Three of seven are now confirmed possible. That’s meaningful progress from what I originally presented. But requirements 5, 6, and 7 remain hard blockers — and those are the ones that make the difference between “scriptable mixer” and “automatable DAW.”
The Sandbox Is the Real Constraint
@butterfly joked that complaining about the MIDI Remote API only supporting MIDI is like saying a snare drum can’t play melody. That’s a fun analogy — but actually, a snare drum has a fundamental pitch and a harmonic series. Load a snare sample into a sampler that can transpose, and you can play any melody you want with a snare tone. The instrument’s limitation was never physical — it was the interface. Give it a better interface (a sampler with a keyboard), and it becomes melodic.
That’s exactly the point of this thread. Cubase’s internal capabilities are rich — m.c proved that. The limitation is the interface we’re given to access them. Give us a better interface, and Cubase becomes programmable.
@cubace made the key technical point in #60: MIDI as a data transport protocol is fundamentally limiting for programmatic control. It was designed for note data at 31.25 kbit/s. Even over USB, wrapping structured commands (create track, set parameter by name, query project state) into MIDI SysEx or CC messages is an awkward abstraction. HTTP/JSON, WebSocket, IPC, or even simple stdin/stdout would be orders of magnitude more natural for this use case.
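To illustrate the awkwardness rather than just assert it, here is a toy sketch of framing a structured command as SysEx. The 0xF0/0xF7 framing bytes and the 0x7D non-commercial manufacturer ID are real MIDI conventions; the JSON-over-SysEx protocol itself is entirely invented for illustration:

```javascript
// Toy protocol: wrap a JSON command in a SysEx frame. SysEx data bytes
// must be 7-bit (0x00-0x7F), which plain ASCII JSON happens to satisfy;
// binary payloads would additionally need 7-bit packing.
function commandToSysEx(cmd) {
  const json = JSON.stringify(cmd);
  const payload = [...json].map((c) => {
    const b = c.charCodeAt(0);
    if (b > 0x7f) throw new Error('SysEx data bytes must be 7-bit');
    return b;
  });
  return [0xf0, 0x7d, ...payload, 0xf7]; // F0 <mfr id> <data...> F7
}

function sysExToCommand(bytes) {
  if (bytes[0] !== 0xf0 || bytes[bytes.length - 1] !== 0xf7) {
    throw new Error('not a SysEx frame');
  }
  const json = bytes.slice(2, -1).map((b) => String.fromCharCode(b)).join('');
  return JSON.parse(json);
}

const msg = commandToSysEx({ op: 'setParam', track: 3, param: 'Threshold', value: -18 });
console.log(sysExToCommand(msg));
```

It works, but every layer — escaping, framing, chunking long messages, matching requests to replies — is something HTTP/JSON or a WebSocket would give you for free.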
@David_Nuttall demonstrated this problem concretely with his hybrid stack — virtual MIDI + SendKey + interpreted scripting + HTTP/WebSocket service — all built to work around a sandbox that doesn’t need to exist for programmatic use cases.
On DAWproject and Standards
@CKB raised an interesting point about DAWproject as a potential foundation. While I understand the logic — DAWproject defines a structured format for tracks, plugins, routing, and media — I agree with @Johnny_Moneto that it’s solving a different problem. DAWproject is a static exchange format (save → transfer → load). An AI assistant needs a live, bidirectional API (query state → analyze → modify → re-query). They’re complementary, not overlapping.
That said, if Steinberg ever builds a proper API, the data model they’ve already defined for DAWproject could inform the API’s object structure. The ontology exists — it’s the interface that’s missing.
On MickeyB’s Project File Analysis
@CKB highlighted MickeyB’s approach of parsing Cubase .cpr project files externally with Claude to generate plugin inventories and routing maps. This is creative and useful for static analysis (auditing projects without opening them, tracking plugin usage across projects). It won’t work for live mixing assistance (you need real-time state, not a snapshot from disk), but it proves there’s demand for programmatic access to project data — and people will find a way to get it, even if it means reverse-engineering binary file formats.
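For anyone curious what the crude first pass at that kind of reverse engineering looks like: the .cpr format is proprietary and undocumented, so this is emphatically not a parser — it’s just a “strings”-style scan that pulls printable-ASCII runs out of a binary buffer, which is typically how you’d locate plugin and track names before doing any real format analysis:

```javascript
// NOT a .cpr parser. A generic "strings"-style scan: extract runs of
// printable ASCII (length >= minLen) from an opaque binary buffer.
function extractStrings(buf, minLen = 6) {
  const out = [];
  let run = '';
  for (const b of buf) {
    if (b >= 0x20 && b <= 0x7e) {
      run += String.fromCharCode(b);
    } else {
      if (run.length >= minLen) out.push(run);
      run = '';
    }
  }
  if (run.length >= minLen) out.push(run);
  return out;
}

// Toy buffer mixing binary noise with embedded names (illustrative only):
const toy = Buffer.concat([
  Buffer.from([0x00, 0x01, 0xff]),
  Buffer.from('FabFilter Pro-Q 3'),
  Buffer.from([0x00, 0x00]),
  Buffer.from('Lead Vocal'),
]);
console.log(extractStrings(toy));
```

Enough to build a plugin inventory from disk, and nowhere near enough for live state — which is exactly the gap noted above.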
On “Steinberg Should Control AI Features Themselves”
Several people suggested that Steinberg should build AI features internally rather than opening an API. I respect this perspective, and @awesomeaudio articulated it well. But @MattiasNYC nailed the counterargument:
If you open everything up you have 1 Steinberg + X third party developers. So in practice it would probably mean more new features over any given time period.
And @djpat’s VST analogy is perfect: Steinberg invented VST. They didn’t try to build every synthesizer and effect themselves — they created a standard and let the world build on it. That decision made Cubase the center of a massive plugin ecosystem. An API would do the same for automation, scripting, and AI tools.
The competitive pressure is real. @MattiasNYC put it bluntly: Steinberg is “sandwiched between the cheaper alternative on one end and the professional industry standard on the other.” REAPER has had scripting since its early days. Pro Tools is expanding its ecosystem. Cubase’s competitive advantage has always been workflow quality — but workflow is exactly what scripting and AI tools enhance.
What I’m Doing Next
I’m continuing to build the REAPER prototype. The architecture is “tools, not workflows” — a thin orchestrator that gives the LLM 30+ tools (analyze spectrum, get masking, set EQ parameter, screenshot plugin GUI) and lets it decide how to solve problems. The model sees the mix, proposes changes, the engineer approves, and the system re-analyzes for cascading effects.
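The “tools, not workflows” shape is simpler than it sounds — a flat registry of small named tools the model calls with JSON arguments, and a dispatcher that relays those calls. Here’s a minimal sketch; the tool names and return shapes are illustrative stand-ins, not the actual prototype:

```javascript
// Sketch of a "tools, not workflows" orchestrator: the LLM picks tools
// by name and passes JSON args; the orchestrator only dispatches.
const tools = new Map();

function registerTool(name, description, fn) {
  tools.set(name, { description, fn });
}

function callTool(name, args) {
  const tool = tools.get(name);
  if (!tool) throw new Error(`unknown tool: ${name}`);
  return tool.fn(args);
}

// Two toy tools standing in for real analysis/DSP:
registerTool('get_track_names', 'List track names in the session',
  () => ['Kick', 'Snare', 'Vox']);
registerTool('set_eq_param', 'Set one EQ parameter on a track',
  ({ track, band, freqHz }) => `${track}: band ${band} -> ${freqHz} Hz`);

console.log(callTool('set_eq_param', { track: 'Vox', band: 2, freqHz: 2400 }));
```

The point of the flat registry is that the model, not the orchestrator, decides the sequence — which is what lets it handle mixes the workflow author never anticipated.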
For those asking for a demo (@CKB and others): I’ll share progress once there’s something worth showing. Expect a CLI MVP within a few weeks — nothing flashy, but functional enough to demonstrate the concept: natural language → audio analysis → proposed changes → apply → re-analyze.
But here’s the thing I want to emphasize: thanks to m.c’s work, the Cubase version is closer to feasible than I originally thought. Requirements 1-3 (read state, read params, set params) appear to be covered by Direct Access. The remaining blockers (track creation, rendering, external communication) are real, but a meaningful subset of mixing assistance could work within the current constraints — especially if someone builds a bridge between the sandboxed JS environment and an external process (even if that bridge is MIDI SysEx, ugly as it would be).
If m.c is willing to collaborate on exploring what’s possible within the current Direct Access capabilities, I’d be genuinely interested. The worst case is we document exactly where the real walls are. The best case is we find enough capability to build something useful.
Thanks to everyone who contributed substantively to this discussion — especially m.c for the code, @cubace for the NDA API insight, @MattiasNYC for the strategic framing, and @philmac for the real-world perspective. This is the kind of thread that, if Steinberg pays attention, could actually inform their roadmap.