I’m a developer and Cubase 14 Pro user. Over the past few weeks I’ve been building an AI mixing assistant - a local agent that analyzes audio, detects frequency masking between tracks, suggests EQ/compression changes, and executes them in the DAW through natural language commands. Think of it as an intelligent junior engineer that holds the entire session context in memory and helps you iterate through mix problems.
I started this project targeting Cubase. After extensive research into the MIDI Remote API, I had to abandon Cubase entirely and switch to REAPER. Not because REAPER sounds better or has a better UI - but because Cubase literally cannot be controlled programmatically at the level required for modern AI-assisted workflows.
I’m writing this post to lay out the technical reality, show what the competition already enables, and propose what Steinberg could build to close the gap. This isn’t a wish list - it’s a competitive analysis with specific technical requirements.
The Current State: MIDI Remote API Limitations
The MIDI Remote API (introduced in Cubase 12, currently v1.1) was designed for hardware controller mapping. It runs sandboxed ES5 JavaScript inside Cubase’s process and communicates exclusively through MIDI ports. Here’s what it can do:
- Transport control (play, stop, record, cycle)
- Mixer operations (volume, pan, mute, solo, send levels)
- Channel strip parameters (gate, compressor, EQ, saturation, limiter)
- Plugin parameters via Quick Controls (limited to 8 parameters per focused plugin)
- Key command execution
- Value callbacks for parameter read-back
Here’s what it cannot do - and this is where the problem lies:
- Cannot create tracks, buses, or sends
- Cannot insert or remove plugins programmatically (the Direct Access workaround requires parsing XML config files and breaks whenever plugins are installed or removed)
- Cannot read full project state - no way to iterate over all tracks and their properties
- Cannot access send bus routing
- Cannot manipulate MIDI events, audio regions, or arrangement
- Cannot query project structure (track count, track names, track types)
- No HTTP, WebSocket, or file I/O - completely sandboxed
- No network access - cannot communicate with external applications except through virtual MIDI ports
This means an external application must impersonate a MIDI hardware controller via virtual MIDI ports (IAC Driver on macOS) just to send basic mixer commands. Structural operations like creating tracks or inserting plugins are simply impossible through any supported API.
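To make the constraint concrete, here is roughly what that bridge looks like from the outside - a minimal sketch in Python using the `mido` library (python-rtmidi backend). The port name is whatever you configured; "IAC Driver Bus 1" is just the macOS default, and a MIDI Remote script inside Cubase has to be listening on it:

```python
# Minimal sketch of the MIDI-bridge workaround: an external app pretending
# to be a hardware controller on a virtual MIDI port.
import mido

port = mido.open_output("IAC Driver Bus 1")

# The entire vocabulary is raw MIDI - e.g. CC 7 mapped to channel volume.
# There is no message that means "create a track" or "insert an EQ on track 3".
port.send(mido.Message("control_change", channel=0, control=7, value=96))
port.close()
```

Every command has to be squeezed into 7-bit controller values, which is why structural operations simply have no encoding at all.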
The Competition: What REAPER and Ableton Already Enable
Here’s a direct comparison of API capabilities:
| Capability | Cubase (MIDI Remote API) | REAPER (ReaScript) | Ableton (Live API) |
|---|---|---|---|
| API function count | ~50–100 | 900+ | 200+ |
| Scripting languages | ES5 JS (sandboxed) | Lua, Python, EEL2, C++ | Python, Max/MSP |
| Create tracks | ❌ | ✅ | ✅ |
| Delete tracks | ❌ | ✅ | ✅ |
| Insert plugins by name | ❌ | ✅ | ✅ |
| Remove plugins | ❌ | ✅ | ✅ |
| Read all plugin parameters | 8 via Quick Controls | ✅ | ✅ |
| Set plugin parameters | 8 via Quick Controls | ✅ | ✅ |
| Get plugin parameter names | ❌ | ✅ (TrackFX_GetParamName) | ✅ |
| Get formatted param values | ❌ | ✅ (TrackFX_GetFormattedParamValue) | Partial |
| Iterate all tracks | ❌ | ✅ | ✅ |
| Read project structure | ❌ | ✅ | ✅ |
| MIDI event manipulation | ❌ | ✅ | ✅ |
| Audio region manipulation | ❌ | ✅ | Partial |
| Render tracks/regions | ❌ | ✅ | Partial |
| External process control | MIDI bridge only | ✅ (python-reapy via TCP) | ✅ (via remote scripts) |
| Network/HTTP access | ❌ | ✅ | ✅ |
| MCP servers available | 0 | Multiple | Multiple |
The gap is not incremental - it’s structural. REAPER exposes roughly an order of magnitude more API functions than Cubase (900+ vs. ~50–100), and critically, it allows external applications to control the DAW over TCP without the MIDI bridge workaround.
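For contrast, this is what external control looks like against ReaScript through the python-reapy wrapper - a sketch that assumes REAPER is running and reapy's distant API was enabled once from inside REAPER (`reapy.config.enable_dist_api()`); the track and plugin names are examples:

```python
# External process controlling REAPER over TCP via python-reapy.
import reapy

project = reapy.Project()                            # currently open project
track = project.add_track(index=0, name="Drum Bus")  # structural op over TCP
fx = track.add_fx("ReaEQ")                           # insert a plugin by name

# Full parameter access - every parameter's name and value, not just 8.
for i in range(fx.n_params):
    param = fx.params[i]
    print(param.name, float(param))
```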
What’s Already Being Built on Competitors
This isn’t theoretical. AI-DAW integration is happening right now, and Cubase is being left behind:
REAPER:
- DAWZY (presented at NeurIPS 2025) - an open-source AI assistant that converts natural language to reversible ReaScript actions. Achieved 100% task success rate with GPT-5 on production tasks. Published academic paper with user studies.
- reaper-mcp by itsuzef - a comprehensive MCP server with 25+ tools for project management, track operations, MIDI composition, mixing, and mastering.
- reaper-mcp-server by dschuler36 - MCP server for project analysis and audio mixing feedback.
- reaper-mcp-ai-analyzing - MCP server with AI-powered audio analysis, supporting 25+ tools for REAPER control.
- Multiple community MCP servers on GitHub, all built on top of REAPER’s rich ReaScript API.
Ableton:
- ableton-mcp by ahujasid - MCP bridge with tools for track creation, instrument loading, MIDI clip generation, and transport control.
- ableton-11-mcp by cafeTechne - 220+ MCP tools including music theory generators and agentic workflows.
- An active community building AI integrations through the Live API + Max for Live.
Cubase:
- Nothing. Zero MCP servers. Zero AI integration projects. Not because no one wants to - because the API doesn’t allow it.
What I Was Trying to Build (And Why I Left Cubase)
My project - a local AI mixing assistant - requires these core capabilities:
1. Read session state: list all tracks, their names, types, routing, and FX chains
2. Read plugin parameters: get all parameters of any plugin (not just 8 via Quick Controls), including parameter names and human-readable values
3. Set plugin parameters: modify any parameter on any plugin
4. Insert/remove plugins: add an EQ to a track, remove a compressor
5. Create tracks and buses: set up routing, create group buses
6. Render audio regions: export a Time Selection of specific tracks for spectral analysis
7. Communicate with external processes: send analysis results to an LLM, receive commands back
REAPER’s ReaScript API covers all 7 requirements natively. Cubase’s MIDI Remote API covers only a fraction of #2 and #3 (limited to 8 Quick Control parameters), and none of the others.
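To illustrate how directly ReaScript covers this, requirement #1 is a few lines through python-reapy (the raw ReaScript equivalents are functions like CountTracks, GetTrackName, and TrackFX_GetFXName):

```python
# Requirement #1: read session state - every track and its FX chain.
import reapy

project = reapy.Project()
for track in project.tracks:                  # iterate every track
    fx_names = [fx.name for fx in track.fxs]  # full FX chain per track
    print(track.name, "|", ", ".join(fx_names) or "no FX")
```

There is no equivalent of any of these calls in the MIDI Remote API.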
I had to switch to REAPER. This is a user that Steinberg lost not because of sound quality, workflow, or features - but because of API limitations.
What Steinberg Could Build
In order of impact and (estimated) implementation effort:
Option A: Full Scripting API (Best, Most Impactful)
A ReaScript-equivalent for Cubase - a comprehensive API accessible from Python/Lua/JavaScript that can:
- Enumerate and query all project objects (tracks, buses, sends, plugins, MIDI events, audio regions, markers)
- Create, modify, and delete project objects
- Read and write all plugin parameters (not just Quick Controls)
- Render audio regions programmatically
- Be accessible from external processes (TCP/IPC, not just MIDI)
- Maintain backward compatibility as features are added
This is what REAPER shipped in 2006 and has expanded to 900+ functions over 20 years. It doesn’t need to be 900 functions on day one - even 100 well-chosen functions covering tracks, plugins, rendering, and project structure would transform what’s possible.
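To be concrete, here is a purely hypothetical sketch of what that first cut could feel like from Python. Every name below is invented for illustration - nothing like this exists in Cubase today:

```python
# HYPOTHETICAL: an invented surface covering tracks, plugins, rendering,
# and project structure - the four areas a first release would need.
import cubase_api as cb  # hypothetical module

project = cb.get_current_project()
for track in project.tracks():                      # enumerate project objects
    print(track.name, track.type, track.output_bus)

eq = project.track(0).insert_plugin("Frequency")    # structural operation
for p in eq.parameters():                           # full parameter access
    print(p.name, p.formatted_value)

project.render_selection(tracks=[0, 1], path="/tmp/stems")  # programmatic render
```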
Option B: Expand MIDI Remote API (Medium Effort)
Keep the existing architecture but add critical missing capabilities:
- `midiRemote.getHostValue('project/trackCount')` - project structure queries
- `midiRemote.createTrack(name, type)` - structural operations
- `midiRemote.insertPlugin(trackIndex, pluginName)` - plugin management
- `TrackFX_GetParamName` / `TrackFX_GetFormattedParamValue` equivalents - full plugin parameter access
- HTTP/WebSocket listener - allow external applications to send commands without the MIDI bridge
- File I/O - read/write files for data exchange
Option C: Official MCP Server (Fastest Path to AI Integration)
Ship a Cubase MCP server that exposes DAW operations as Model Context Protocol tools. This is the direction the industry is heading - MCP is becoming the standard interface between AI models and applications. Anthropic created it, and it’s now under the Linux Foundation. REAPER and Ableton already have community-built MCP servers. An official Steinberg MCP server would:
- Immediately enable AI assistant integration
- Not require changes to Cubase’s internal architecture (it’s an external bridge)
- Position Cubase as AI-forward rather than AI-behind
- Give the community a foundation to build on
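For illustration, here is a minimal sketch of such a server using the official `mcp` Python SDK (FastMCP). The `cubase_bridge` module is hypothetical - it stands in for whatever IPC Steinberg would expose:

```python
# Minimal MCP server sketch exposing two Cubase operations as tools.
from mcp.server.fastmcp import FastMCP

import cubase_bridge  # hypothetical Steinberg-provided bridge to the running app

mcp = FastMCP("cubase")

@mcp.tool()
def list_tracks() -> list[dict]:
    """Return name, type, and routing for every track in the project."""
    return cubase_bridge.query("project/tracks")

@mcp.tool()
def insert_plugin(track_index: int, plugin_name: str) -> str:
    """Insert a plugin on a track and return its instance ID."""
    return cubase_bridge.command("insert_plugin", track_index, plugin_name)

if __name__ == "__main__":
    mcp.run()  # stdio transport; any MCP client can connect
```

Everything above the bridge is boilerplate the community can write in a weekend; the bridge itself is the only piece that needs Steinberg.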
I’m Ready to Contribute
I’m not just requesting features - I’m offering to help build them. I have:
- Deep research into REAPER’s ReaScript API, Ableton’s Live API, and Cubase’s MIDI Remote API
- A working prototype of an AI mixing assistant (currently targeting REAPER) with audio analysis, plugin vision, and iterative mix problem solving
- Experience with MCP server development, Python, TypeScript, and audio analysis (Essentia, pyloudnorm)
- A detailed architectural plan for a Cubase AI assistant that I had to shelve due to API limitations
If Steinberg provides even a basic expanded API or an MCP framework, I will build and open-source a Cubase MCP server and AI integration toolkit for the community. I suspect other developers feel the same way.
Summary
Cubase has world-class audio quality, a mature workflow, and a loyal user base. But it is falling behind on programmability - and in 2026, programmability is what enables AI integration. Every month that passes, more AI tools are built for REAPER and Ableton, and the ecosystem gap widens.
The request is specific and technical:
1. Expose project structure (tracks, routing, plugins) to external queries
2. Allow full plugin parameter access (all parameters, not 8 via Quick Controls)
3. Enable structural operations (create tracks, insert plugins) via API
4. Allow external process communication (HTTP/TCP, not just MIDI)
5. Support audio region rendering via API
These five capabilities would transform Cubase from a closed system into a platform — and let the community build the next generation of tools on top of it.
I’d love to hear from Steinberg and from other developers who share this need.