How to deal with motorised faders

If you’re using a MIDI controller with motorised faders, like the Behringer X-Touch, BCF2000 or Mackie MCU, you have to manually set the takeover mode to “Jump” every time you make a new mapping in the Mapping Assistant.

Reason: fader mappings are automatically set to “Scaled” mode, because 99% of all MIDI controllers have unmotorised faders, and “Jump” mode is very counterintuitive with those.

There is a good chance that we will add a special “(TouchSensitive)MotorFader” surface element in a future version.


Hi @Jochen_Trappe

Sorry to bother you, but you have talked about an upcoming “(TouchSensitive)MotorFader” surface element several times here on the forum, and although I’m sure you have good reasons, I was a bit disappointed that it did not come in version 12.0.30.

Is there any chance that it would be implemented in the near future?

The reason I ask is that I’ve been playing with the API recently, and I’m getting to the point where I could switch from MCU + Generic Remote to the API in the not-so-distant future, but for me a touch-sensitive fader is mandatory.



Hi @thomas_martin, there will be a solution for motorised faders in a Cubase 12 maintenance update. But as you might imagine, there are a lot of topics we are addressing. I know you don’t want to hear it, but for the moment please use the “good old” Mackie implementations, because most motor-fader devices speak “Mackie”.



Hi, and thanks for chiming in,

Don’t worry, my team and I are not quite ready to use Nuendo 12 in production yet (our templates are old-fashioned and are suffering from the end of VST2, but that’s another story).

So I have time to play around with the new API. I have quite a big DIY controller to map, and so far I’m really pleased with this new feature.



Please share a photo of that :smiling_face:


There you go,

These pictures are a bit old; I’ve since updated the touchscreen part of it (I was using Lemur, and have now switched to TouchOSC).

The hardware part is based on the great MIDIbox project.

The touch screen is used to display and control the selected channel strip.
I should make a video of it once I’ve completed it.
In the future, I plan to add another pair of touchscreens for full display and control of 24 channel strips…


Wowsers @thomas_martin that’s some setup you have! Awesome.

Those scribble strips are being driven over MCU, I guess? What a cool project.

Thank you @skijumptoes !

As of today, the controller emulates 3 MCUs plus one Generic Remote. So the small OLED screens above the faders display MCU sysex coming from Nuendo. The others, above the encoders on the right side, are handled by the controller itself.
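For reference, the scribble-strip text those OLEDs are decoding is the standard MCU LCD sysex, which can be built like this (a sketch; 0x14 is the stock MCU device ID, and the function name is mine):

```javascript
// Build an MCU-style LCD sysex message that writes `text` starting at
// character position `offset` (0-55 = top row, 56-111 = bottom row).
function makeMcuLcdSysex(offset, text) {
    var bytes = [0xF0, 0x00, 0x00, 0x66, 0x14, 0x12, offset & 0x7f];
    for (var i = 0; i < text.length; i++) {
        bytes.push(text.charCodeAt(i) & 0x7f); // 7-bit ASCII only
    }
    bytes.push(0xF7);
    return bytes;
}
```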

I also have a HUI mode for compatibility with He Who Must Not Be Named. In this mode, however, the LEDs don’t work (because HUI sucks). But I don’t really use it, so I don’t mind…



That is very kind of you @Jochen_Trappe .

Truth is, this project is a constant learning journey for me (which I like). Before I started it I knew nothing about electronics or programming (and I still don’t know much, to be honest).


I was wondering what the status of this is? I still have to hijack the PreSonus FaderPort protocol to use touch on my DIY motorfader, which is limiting. Access to touch data would open up a lot of possibilities for DIY controller builders.

Wow, wow, wow!!! A M A Z I N G!!!

Has support for motorized faders/potis been added in the 12.0.50 release?

No - I wish it was!!

I just presumed converting the 0-1 float value into a MIDI message that the fader responds to would work for faders?

That works fine for me in my BCF2000 implementation. I implemented it in JS for a variety of reasons, but the motor faders worked fine in the GUI builder as well.

The LED rings were the tricky bit as the message to control them has nothing to do with the knob they relate to, but faders just worked as expected.

If the fader listens for pitch bend on a channel, just call this when you create each fader:

function makeFaderDisplayFeedback(fader, channel) {
    fader.mSurfaceValue.mOnProcessValueChange = function (context, newValue) {
        // Scale the normalized 0-1 value up to the 14-bit pitch-bend range
        var value14 = Math.round(newValue * 16383);
        var loVal = value14 & 0x7f;
        var hiVal = value14 >> 7;
        midiOutput.sendMidi(context, [0xE0 + channel, loVal, hiVal]);
    };
}
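Factored out as a pure helper, the conversion itself can also be sanity-checked outside Cubase (a sketch; the helper name is mine, and `channel` is 0-based):

```javascript
// Convert a normalized 0-1 value into a 3-byte pitch-bend message
// for the given 0-based MIDI channel.
function toPitchBend(normValue, channel) {
    var value14 = Math.round(normValue * 16383); // full 14-bit range
    return [0xE0 + channel, value14 & 0x7f, value14 >> 7]; // status, LSB, MSB
}
```

For example, `toPitchBend(1, 0)` yields `[0xE0, 127, 127]`, i.e. full scale on channel 1.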



Should we narrow things down a bit, a year or so after the initial launch?

Currently, as of 12.0.60, I can use a motorised fader OK, but there are one or two details.

First of all, to be totally clear, I will describe how this motorized fader is created in the surface editor.

  1. Create a new fader. For the MIDI message, click Learn, move the fader around and select THAT message (pitch bend, CC, whatever). Assign Mixer Bank Channel 1 Volume to it.
  2. Create a button. For the MIDI message, click Learn, and just touch the fader. Select THAT message for the button. Assign Mixer Bank Channel 1 Write Automation to it, in momentary mode.
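For anyone building the same thing as a MIDI Remote script rather than in the surface editor, the setup above looks roughly like this. This is a sketch, not a definitive implementation: it only runs inside Cubase/Nuendo as a MIDI Remote script, and the pitch-bend channel and touch note number (104) are assumptions you would replace with whatever your hardware actually sends.

```javascript
var midiremote_api = require('midiremote_api_v1')

var deviceDriver = midiremote_api.makeDeviceDriver('DIY', 'MotorFader', 'Me')
var midiInput = deviceDriver.mPorts.makeMidiInput()
var midiOutput = deviceDriver.mPorts.makeMidiOutput()
deviceDriver.makeDetectionUnit().detectPortPair(midiInput, midiOutput)

var surface = deviceDriver.mSurface

// Step 1: the fader itself, listening on pitch bend, channel 1 (assumption)
var fader = surface.makeFader(0, 0, 1, 6)
fader.mSurfaceValue.mMidiBinding.setInputPort(midiInput).bindToPitchBend(0)

// Step 2: the touch sensor, assumed to arrive as note 104 on channel 1
var touch = surface.makeButton(0, 6, 1, 1)
touch.mSurfaceValue.mMidiBinding.setInputPort(midiInput).bindToNote(0, 104)

var page = deviceDriver.mMapping.makePage('Default')
var channel = page.mHostAccess.mMixConsole.makeMixerBankZone().makeMixerBankChannel()

// Fader -> volume (Jump takeover, as discussed above), touch -> write automation
page.makeValueBinding(fader.mSurfaceValue, channel.mValue.mVolume).setValueTakeOverModeJump()
page.makeValueBinding(touch.mSurfaceValue, channel.mValue.mAutomationWrite)
```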

So, by touching the fader we engage write automation, which allows us to freely move the fader without fighting already written automation. Using different automation modes (To Start, To End etc) we can manipulate data as needed to fill gaps created by the lack of touch functionality.

The details:

1. No “Touch” as a concept. We don’t have a “Touch” command that we can assign, so that we can put a channel into Write, touch the fader, and continuously write the fader’s value.

What happens instead is that we abuse the momentary “Write” button, so that we stop fighting any written automation that is currently being read. As long as the fader moves, automation is written (punch in). But shortly after we stop moving the fader, a punch out occurs, because the fader stops sending its current value.

So, we must either get a dedicated “Touch” command, so that automation modes and punches work as advertised, or the faders would have to keep sending the same value over and over, no matter the value (which is probably an awful idea).

2. Jumpy motion of the faders when reading automation. That could very well be my unit, so I don’t really know if something’s definitely wrong there. (The faders do receive pitch bend, but the actual motor movement is driven by CC values (1–127), so jumpy motion is probably to be expected with just 127 possible physical fader positions.)
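The jumpiness is consistent with 7-bit resolution: quantising the 14-bit pitch-bend range down to 7-bit CC values loses up to 127 raw values per position, roughly 0.8% of full travel. A quick sketch of that round trip (the helper name is mine):

```javascript
// Quantize a 14-bit value (0-16383) down to 7 bits and back,
// as a fader motor driven by CC values effectively does.
function roundTrip7bit(value14) {
    var cc = value14 >> 7; // 14-bit -> 7-bit (0-127)
    return cc << 7;        // 7-bit -> back to 14-bit
}

var worstError = 0;
for (var v = 0; v <= 16383; v++) {
    worstError = Math.max(worstError, Math.abs(v - roundTrip7bit(v)));
}
// worstError ends up at 127, i.e. about 0.8% of the full range per step
```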

So, at this point in time, April 2023, do you have problems other than the above (mainly point 1) with motorised faders when using the surface editor (no JavaScript)?

It seems to me that you should have the auto-latch enabled for this. Or maybe you mean something else?

Auto-Latch mode keeps the last value sent from the fader on un-touch, until the automation punch out.


This is the image from the manual.

With Write enabled, we touch the fader at the punch-in position, lower the fader, and let go of it at the un-touch position. Write then continues with that last un-touch value. When we punch out of Write, the value returns to the initial value from before the punch in.

For this to happen we need some kind of “touch awareness” function/command (I don’t know the proper word). Unless I have been blind for a year, I cannot see such a thing in the Mapping Assistant.

What we can do, though, to get the job done, is misuse the Write Automation button in a momentary state, so that we constantly punch in and out of automation (regardless of mode) and just record the moving fader’s value.

Now, I hope this is all pilot error on my part. But I also wish to get a discussion going, beyond “this is crap”, “doesn’t work”, “works fine”, “works for me”. Which aspects of the whole motorised-fader workflow work fine, and which don’t? I am sure there are people here with vastly more experience than mine who could update us all on the situation, so that we know, depending on our demands of the automation system, which things work in the current state and which don’t.

What would you wish to happen right there? Should it stay at your last fader value? But then, what’s the point of the punch-out?