X-Touch Midi Remote Cubase 12

No comment :wink:

The problem is the MCU protocol itself: it doesn’t cover some features needed for modern control surfaces (surround panning, among others).
The X-Touch is a cheap copy of the original MCU (with some extended functionality for X-32/M-32/XR scenarios).

Damn! :grinning:

Just tried using it as a slave over the network (LAN). Works as expected. It might have something to do with USB comms issues on my M1 laptop. It actually seems to be working better than before.

Just a short comment on the actual X-Touch device.
Keep in mind that you CAN set it up to work as a “normal” MIDI controller as well. At least the single fader version, X-Touch One can do this, so I’d imagine the bigger brother can as well.

On the X-Touch One you hold down the clicky encoder while powering up. The LCD then prompts you to select a mode, and instead of MCU you can select a generic MIDI controller mode as well (the manual has a section on this, and I can’t quite recall what it’s called… I think there are both MIDI and “User” modes).
In the MIDI mode you should be able to map it the way you’d expect in the MIDI Remote Editor.
Mind you, I run mine in Mackie Control mode, so I haven’t given it much time. But I can’t see why it shouldn’t work.
As far as opening a plugin UI from the controller goes, you can open VST instruments by assigning Track (or is it called VST track?) → Selected → Edit… That should open the selected instrument’s UI. I’ve set it up on other MIDI controllers just fine.

There are limitations in the MIDI Remote implementation that prevent comfortable use as a mixer remote.
The MCU protocol addresses exactly these limitations, but introduces limitations of its own.
As @Jochen_Trappe already pointed out, it is just the beginning.

Oh okay. Maybe I have to play about with it and see. I think I’ll keep mine in mackie control mode, as that’s really doing everything I want right now.

I am able to use it now after setting it up with the old method

This worked immediately. Thanks

I do look forward to the MIDI Remote module enabling even more of the X-Touch’s capabilities, which I’m pretty sure will eventually be the case.

English, please… English part of the forum.
A German topic on similar issues is this one:

The only gripe I have is the lack of “autobanking”
Curse you Mackie!

The faders issue is annoying but easily fixed. When using the auto-add feature (touch a control to discover it and add it to the map), the fader is recognised twice: once when you touch it and once when you move it. You simply need to delete the fader mapping associated with touch. I was in Ctrl mode on the X-Touch for this. In this mode you can create a template, and every button, fader and pot is recognised.
However, there are still issues when adding functions. I found there is no LED feedback to turn lamps back off, and strange behaviour across the channel select, mute and solo buttons. F1 to F8 work nicely, transport works as expected, bank and channel buttons are working, zoom buttons are working. Most of the other buttons I hadn’t tried yet, as I got hung up with the channel buttons not behaving correctly. The unit does appear to work as normal in MCU mode, but I think you need to choose the legacy setup, which is going to be removed at some point. This is from memory, as I’m away from the unit, so I’ve probably missed or incorrectly stated some issues.

NOPE, not true. The only SysEx data used by MCU is to drive the display, plus an initial handshake (so that the DAW knows what device is connected). EVERYTHING else is nearly 40-year-old, plain, simple, basic MIDI. Please stop spreading myths that are simply not true.
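To illustrate just how small that SysEx surface is, here’s a sketch of building an MCU LCD text update. The header bytes used (manufacturer ID 00 00 66, device ID 0x14, command 0x12) are the commonly documented Mackie Control values, so verify them against your own unit:

```javascript
// Sketch: building the one place MCU does use SysEx, an LCD text update.
// Header bytes (manufacturer 00 00 66, device 0x14, command 0x12) are the
// commonly documented Mackie Control values; verify against your unit.
function buildMcuLcdMessage(offset, text) {
  var msg = [0xF0, 0x00, 0x00, 0x66, 0x14, 0x12, offset & 0x7F];
  for (var i = 0; i < text.length; i++) {
    msg.push(text.charCodeAt(i) & 0x7F); // the displays are 7-bit ASCII only
  }
  msg.push(0xF7);
  return msg;
}
```

Everything else (faders, buttons, V-Pots) is indeed plain pitch bend, note and CC traffic.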

While the API might be a nice thing for freaks and coders, it is useless if you can’t do Java. The worst thing is that you will be dependent on other people who understand Java. So the communication to solve problems will be like: “Hey, I have hardware I would like to use in Cubase.” “Oh sorry, I do not own your hardware, I can not really help.”

It might be the beginning for the MIDI Remote, but musicians and artists expected a more advanced Generic Remote and nothing less. Users want to be able to set up a MIDI implementation chart like it’s covered in EVERY good hardware manual, not with abstract layers like Java. It is ridiculous to expect that from a novice/normal musician/artist. Find me a single manual where the MIDI implementation expects the Java language.

At least now, via the API, there exists the ability for new MCU implementations from coders that musicians can build on top of. Previously MCU was locked tight behind a sealed door.

Personally, I see no way around requiring coding knowledge and going through an API for something more complex such as MCU implementations. Yes, it’s relatively simple MIDI data that is being transmitted, but it’s not so trivial to format and respond to different keys being held, banks and modes being adjusted, and screen feedback updating as a result.
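To give a flavour of that statefulness, here is a minimal sketch of a held modifier changing what a bank button does (the note numbers here are hypothetical placeholders, not real MCU values):

```javascript
// Minimal sketch of surface state: a held modifier key changes the
// meaning of a bank button. Note numbers are hypothetical placeholders.
var SHIFT = 0x46;
var BANK_RIGHT = 0x2E;
var state = { shiftHeld: false, bankOffset: 0 };

function onButton(note, isDown) {
  if (note === SHIFT) {
    state.shiftHeld = isDown;
  } else if (note === BANK_RIGHT && isDown) {
    // a plain press moves one bank of 8 faders; with shift held, jump 16
    state.bankOffset += state.shiftHeld ? 16 : 8;
  }
  return state.bankOffset;
}
```

Multiply that by every mode, LED and display on the surface and it’s clear why a fixed mapping dialog can’t express it.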

It’s near impossible for SB to put this into a simple interface that caters for so many devices. The work BOME do is a product in itself, for example. And even that requires scripting.

I’ve been working on Komplete Mk2 controllers and have a lot of the screen feedback running, though there isn’t yet the MCU/Remote-style integration where you can scroll through devices and read multiple pages of parameters, names of sends, etc. Hopefully it will come to MIDI Remote in future if we push for it.

In fact, it’s really important that these features become available. Right now we have some of the Generic Remote functions, but we need that deeper knowledge of plugin slots, names, send slots, names and wet value, per track.

If it existed today, I could basically have MCU-style integration running on a Native Instruments controller, without any kind of virtual ports or translations like BOME, with the added advantage of displaying long track names, parameter names and colour coding.

I can already get pretty close, though, and this is early days.

I’ve not tried it yet, but hopefully there’s a way to read MIDI implementation data that the API could utilise. As long as the JS script can read a local file and parse its contents, then anything is possible, as you could have an MCU implementation that’s user-customisable based on what’s expected in and what’s expected to return.

In fact, at the very least you could have a simple .js file with constants that the end user could update - and that would 100% work as you just make a reference to ‘include’ it within the main .js script.

i.e. It would be a simple list such as this:

const MIDI_SYSEX_BEGIN = [0xF0, 0x00, 0x21, 0x09, 0x00, 0x00, 0x44, 0x43, 0x01, 0x00];
const CMD_PLAYBUTTON = 0x10;
const CMD_RESTART = 0x11;
const CMD_RECBUTTON = 0x12;
const CMD_COUNT = 0x13;
const CMD_STOPBUTTON = 0x14;
const CMD_LOOP = 0x16; 
const CMD_UNDO = 0x20;
const CMD_REDO = 0x21;

So from that basic .js file, all you’d need to do is enter the hex/MIDI CC addresses for each control element, and the developed API script would run with it. That in itself could make virtual ports and third-party translators unnecessary in many instances.

Within the API only those "const"s are referenced, so no translation needed. It becomes a native, customisable mapping based on an API engineered integration.
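For instance, here’s a sketch of how such a constants file might be consumed (assuming, as with midiremote_api_v1, that local files can be pulled in via require(); the noteOnFor helper below is made up for illustration, not part of the API):

```javascript
// Hypothetical user-editable mapping, as it might live in a separate
// .js file and be pulled in with require() from the main script.
var userMapping = {
  CMD_PLAYBUTTON: 0x10,
  CMD_STOPBUTTON: 0x14
};

// Build the note-on message (channel 1) for a given command byte.
// In the main script this array would be matched against incoming MIDI
// or handed to midiOutput.sendMidi(...), with no external translator.
function noteOnFor(cmd) {
  return [0x90, cmd & 0x7F, 0x7F];
}
```

Swapping hardware would then just mean editing the numbers in the mapping file, not touching the script logic.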

…Or maybe I’ve totally misread the point you were making!? lol :slight_smile:

JavaScript isn’t Java…

Hm… maybe I oversimplified things here… it was an answer to the request for plugin control with the MIDI Remote implementation, and showing parameters mapped to controls dynamically would involve “SysEx” messages.

As already mentioned, the status of the MIDI-remote is just the beginning here.

If you are in MCU mode on the device, with MIDI Remote it is not possible at this time to perform some tasks, like switching fader banks or accessing plugin lists. At least, I didn’t find a solution.
And I’m sure this will change in the near future.

But for now, the MCU functionality is still available in Cubase.

Sort of related to the discussion, I wanted to see how to make use of 14-bit values in the new MIDI Remote thing.
I wrote a quick hacky thing in Python for my X-Touch One in Mackie Control mode to convert pitch bend (which MCU uses for the fader, for higher-resolution control), and it worked surprisingly easily. The volume of whatever track was selected in Cubase was sent to the fader instantly.
Unless we get to map pitch bend in the controller mapping itself, at least this method should be viable in JavaScript as well?
I’m not really that familiar with it yet, but this tempts me, if it’s somewhat straightforward to do the same thing with the controller API.
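The decoding side of that trick is tiny. A sketch in JavaScript, assuming MCU’s usual scheme of one pitch-bend channel per fader:

```javascript
// Sketch: decoding an MCU fader position. MCU sends fader moves as
// pitch bend (one MIDI channel per fader), i.e. a 14-bit value split
// across the LSB and MSB data bytes of the message.
function pitchBendTo14Bit(lsb, msb) {
  return ((msb & 0x7F) << 7) | (lsb & 0x7F);
}

// Normalize to the 0..1 range typically used for surface values.
function normalize(value14) {
  return value14 / 16383;
}
```

Going the other way (DAW to fader) is the same arithmetic in reverse: split the 14-bit value back into MSB and LSB and send it as pitch bend on the fader’s channel.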

Getting no luck with my X-touch.
When I add MCU there is only the option to use xtouchINT for input and not out?

You are correct, JavaScript isn’t Java. My point is still valid: without JavaScript knowledge you can’t use the API properly, and it doesn’t have much to do with “traditional” MIDI implementation charts found in manuals.

SysEx on MCU is only used to display text on the MCU’s display. I don’t see why SysEx would be a hurdle for anything mentioned here.

It’s not a hurdle, but it’s not implemented yet…

Indeed, it’s not a traditional implementation.

@Jochen_Trappe can correct me if I’m wrong, but the idea could be to build a new, very flexible system that enables many kinds of implementation.
A vendor can provide a script for use with Cubase/Nuendo, and the user is still able to alter the functions provided by the vendor.

As mentioned more than once already, this is just the beginning.

SysEx is available, of course. Just use midiOutput.sendMidi(…)
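A sketch of what that could look like inside a script, taking the sendMidi(context, bytes) shape from the statement above (the sendSysex wrapper itself is a hypothetical helper, not part of the API):

```javascript
// Sketch: wrapping midiOutput.sendMidi for SysEx. The helper frames a
// payload with 0xF0/0xF7 and sends it as a plain byte array.
// sendSysex itself is a hypothetical helper, not part of the API.
function sendSysex(midiOutput, context, payload) {
  var msg = [0xF0].concat(payload, [0xF7]);
  midiOutput.sendMidi(context, msg);
  return msg; // returned only so the sketch is easy to inspect/test
}
```

Combined with an LCD message builder, that would be enough to drive an MCU-style display from a MIDI Remote script.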

That’s correct!

Is your X-Touch calibrated correctly? On my X-Touch One, 0 dB in Cubase is 1.6 dB on the fader. So annoying.