So I’ve been using Bome MIDI Translator to get an Icon Platform Nano to transmit button behaviour the way I want. All is working well so far.
I’ve assigned two buttons to step through tracks, and I’m getting the motorised faders to snap to the correct levels. I’ve also set up the four floating rotary knobs to control the EQ (V-pot 1 = gain, push = on/off; V-pot 2 = freq, push = Q), and it works a charm. Great!
What I’d like to do now is read the track’s pan and level, then send that back to the Nano. To avoid getting caught in the Mackie 8-channel grouping, I’d like to send the current track’s pan and level on MIDI channel 1 and always keep the Mackie controller looking at channel 1 (by not using the buttons on the Nano).
If I can get my head around that code, it’d be good to transmit the time code and channel text names as well, again on MIDI channel 1.
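Roughly the kind of message I mean, as a hypothetical sketch (the function name and the packing are my assumptions about the Mackie fader protocol, not anything the Cubase GUI exports): Mackie Control sends fader positions as 14-bit pitch bend, with the MIDI channel selecting the channel strip, so pinning everything to channel 1 would look something like this.

```javascript
// Hypothetical sketch: build the pitch-bend message a Mackie-style surface
// expects for a fader position, pinned to MIDI channel 1 (strip 1) so the
// Nano always shows the selected track regardless of banking.
function faderToPitchBend(normalizedValue) {
  // Clamp to [0,1] and scale to the 14-bit range 0..16383
  var v = Math.max(0, Math.min(1, normalizedValue))
  var value14 = Math.round(v * 16383)
  // Status 0xE0 = pitch bend on MIDI channel 1; data is LSB first, then MSB
  return [0xE0, value14 & 0x7f, (value14 >> 7) & 0x7f]
}
```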
Did you write a script or did you use the GUI in Cubase, please?
I’ve used the GUI in Cubase, and I’m using Bome MIDI Translator as the in and out to pass through to the Nano.
I’ve been looking at the script the GUI in Cubase produces in an editor and am wondering if I can add some lines of code to get the extra transmit data.
Set the type to Jump, please.
Sorry, I’m not sure I understand. The fader is already set to Jump to get the motorised part working. I’d like to get the VU lights on the Nano working with the aftertouch level data Mackie uses to pass volume levels through, but only on MIDI channel 1 for the current Cubase track, thus avoiding the bank-of-8 problem.
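To be concrete, here is a hypothetical sketch of the meter message as I understand it (the function name is mine, and the nibble packing is my assumption about the Mackie meter protocol): the meters use channel pressure (aftertouch), with the strip number in the high nibble of the data byte and the meter level (0–12) in the low nibble, so the strip nibble can be pinned to 0 to stay on MIDI channel 1.

```javascript
// Hypothetical sketch: channel-pressure (aftertouch) meter message in the
// Mackie style. High nibble of the data byte = channel strip (pinned to 0
// here, i.e. strip 1), low nibble = meter level 0-12.
function meterMessage(normalizedLevel) {
  var clamped = Math.max(0, Math.min(1, normalizedLevel))
  var level = Math.round(clamped * 12)
  // 0xD0 = channel pressure on MIDI channel 1
  return [0xD0, (0 << 4) | level]
}
```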
I’m afraid you would need to go the scripting way for this specific request.
So you can’t add script to a GUI script, even though the GUI method produces a script?
I would not want to edit the final thing in the GUI.
Are you sure? Does it produce a *.js file? Or what do you mean by the script, please?
It produces a .json file. I’m now assuming that adding to that is not possible?
A *.json file is not a script. It’s just a data file, something like the *.xml file type.
Right here is where I would like to ask you to please provide a detailed set of instructions that explain to me how to go from…
A. Not having a MIDI Remote at all
B. Having a MIDI Remote .js file (and any necessary, accompanying .JSON file (auto-generated or otherwise)) which can output console.log() to the Cubase script Logging window
Example: .js console.log in the Cubase Logging window
Hi, have you seen this one already?
If you follow the instructions there and create, say, the “Simple Device”, you should be at a good point to start with.
Thanks, yes, I was there quite a few times but didn’t get an explanation for the failure(s) in development I was having.
Looking there again on your suggestion I see…
var helper = require('./helper')
Where (or what) is the ‘./helper’ file?
That’s of no importance for getting started. It’s probably a file required by that particular script.
Here’s a snippet to get you going:
// Create a folder named intro inside the [Path of your Documents folder]/Steinberg/Cubase/MIDI Remote/Driver Scripts/Local
// Then create a folder named script inside the intro folder
// Create a file named intro_script.js and paste this code inside. Place this file inside the intro/script subfolder

// Requiring the API
var midiremote_api = require('midiremote_api_v1')

// Getting the instance of the device driver
var deviceDriver = midiremote_api.makeDeviceDriver('Intro', 'Script', 'Someone')

// Adding MIDI In & Out ports
var midiInput = deviceDriver.mPorts.makeMidiInput("midiInput")
var midiOutput = deviceDriver.mPorts.makeMidiOutput("midiOutput")

// Detecting ports
var detectWin = deviceDriver.makeDetectionUnit().detectPortPair(midiInput, midiOutput)
    .expectInputNameContains('your midi input name or part of it')
    .expectOutputNameContains('your midi output name or part of it')

// Getting the instance of our controller's surface (i.e. buttons/knobs/faders e.t.c.)
var surface = deviceDriver.mSurface

// Defining/designing our surface's controls (position and size here are just an example)
var knob = surface.makeKnob(0, 0, 2, 2)

// Binding our surface controls to their MIDI ports and messages (the CC number is just an example)
knob.mSurfaceValue.mMidiBinding.setInputPort(midiInput).bindToControlChange(0, 21) // CC 21 on MIDI channel 1

// Creating a mapping page
var page = deviceDriver.mMapping.makePage('Intro Page')

// Adding bindings to our page (i.e. assigning commands and host values to our controls)
page.makeValueBinding(knob.mSurfaceValue, page.mHostAccess.mTrackSelection.mMixerChannel.mValue.mVolume)

// Listening to an event
knob.mSurfaceValue.mOnProcessValueChange = function (activeDevice, value, diff) {
    console.log("our knob changed its value to " + value + " diff from its previous value=" + diff)
    // Values are always normalized to the range [0,1]
    // If we want them in the [0,127] range we can write:
    // var value127 = Math.round(127 * value)
}

// Listening to display value changes
knob.mSurfaceValue.mOnDisplayValueChange = function (activeDevice, displayValue) {
    console.log("The display value of our knob changed to " + displayValue)
}

// Listening to title changes
knob.mSurfaceValue.mOnTitleChange = function (activeDevice, title1, title2) {
    console.log("We've just got title1 changed to " + title1 + " and title2=" + title2)
    // title2 is not always available
}

// Listening to color changes
knob.mSurfaceValue.mOnColorChange = function (activeDevice, r, g, b, a, active) {
    console.log("our color changed to (r,g,b,a)=" + r + "," + g + "," + b + "," + a)
    console.log("active changed to " + active) // Boolean
}
Thank you very, very much! I’ll try today! =D
Thanks again for the script and the help. I got it working. =)
I think one oddity is that the logging panel won’t show any console.log output while the Mapping Assistant is open. There are also some actions that can be mapped and will work with the Assistant open, but others need it closed.