I know that when I’ve asked about this before, the response has been “Why would you want to send a Command message to a controller?”, and if that were all I was using GR for, I would agree. The fact is, GR has the potential to do so much more, if only Steinberg would complete its implementation.
Sure, I can add a Command (e.g. Add audio track) and I can set the Receive and Transmit flags, but only the Receive functionality works. That makes sense as far as it goes: press a programmable function button on your controller and a new audio track is added. But I want to send that message via the Transmit flag as well as receive it. This simple addition to the way GR works would mean a great deal to my PhD research and give Cubase/Nuendo a head start on real-time remote DAW collaboration and control over the Internet; my research has focused solely on Cubase rather than other DAWs.
If you are wondering what my research is about, I’ve got a technical paper on my work that’s just been approved for publication in an upcoming issue of the Journal of the Audio Engineering Society - I’ll update this post when I get a publication date if there’s any interest from the Forum.
In a post-COVID world, working remotely is more important than ever. Steinberg could really corner the market with VST Connect, VST Transit, and what I’ve got in store…
That suffices for a PhD degree? I mean, I am really curious, so I am looking forward to reading your publication. I am mainly interested in the scientific aspect, in other words, to what extent we can talk about “science” in these areas at all. Please don’t get me wrong, this is not criticism; I am studying in the field of the theory of science with a philosophical background, so these meta-topics are of great interest to me.
No, my research involves the larger scope of the move to online collaboration methodologies in music production/performance/education and particularly the impact of COVID-19 on studios/recording musicians and educators. The Cubase portion specifically identifies a potentially new way to collaborate through DAW platforms, as opposed to real-time audio streaming as an adjunct to what is already out there. Steinberg has really been at the forefront of music collaboration - first with VST Connect, first with VST Transit, helped develop ReWire…
Unfortunately I’m just one person from Australia, so when I’ve contacted Steinberg for assistance in opening up the functionality of the Generic Remote, it doesn’t create much of an impact.
I do recall seeing this, though it had slipped my mind, so thank you for reminding me!
This sounds very much like they’ll be implementing MIDI 2.0 protocols - the features that Matthias talks about in his post closely describe the new capabilities that MIDI 2.0 will provide. Considering that Steinberg is a platinum sponsor of The MIDI Association, and a member of the MIDI Manufacturers Association if I’m not mistaken, it would be great to see them adopt the MIDI 2.0 paradigm.
In case you’re wondering what MIDI 2.0 can do, see this:
Well, you’d hope that the API developer/homebrewer would handle MIDI 2.0 protocols as required. As long as Cubase exposes the hooks in and out of the API, then “anything” is technically possible: you can interpret the data however you wish and forward it to the hardware.
So, technically, in the API you would set up a script that responds to a parameter change and forwards that change out to the hardware as CC, SysEx, XML, a datastream - whatever.
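As a rough illustration of that idea, here is a minimal sketch of binding a parameter change to an outgoing MIDI CC message. This is not the actual Cubase remote API; the function names and callback shape here are hypothetical, and only the MIDI byte layout (status 0xB0 plus channel, controller number, 7-bit value) follows the MIDI 1.0 spec.

```typescript
// Hypothetical sketch: `MidiSender` and `bindParameter` are invented names,
// not part of any real Cubase API. Only the CC byte format is standard MIDI.
type MidiSender = (bytes: number[]) => void;

// Map a normalized parameter value (0.0–1.0) onto a 3-byte MIDI CC message:
// [status | channel, controller number, 7-bit data value].
function parameterToCC(channel: number, cc: number, value: number): number[] {
  const data = Math.max(0, Math.min(127, Math.round(value * 127)));
  return [0xb0 | (channel & 0x0f), cc & 0x7f, data];
}

// Return a handler that forwards every parameter change to the hardware.
function bindParameter(send: MidiSender, channel: number, cc: number) {
  return (newValue: number) => send(parameterToCC(channel, cc, newValue));
}
```

In this shape, the script author decides the translation (CC here, but it could just as easily serialize to SysEx or XML), which is exactly the flexibility being described above.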
Where MIDI 2.0 excites me most is the prospect of XML data being shared at the plugin/device level. I don’t know if we’re too early in development for that to be implemented, though; until plugins and the like have panel/parameter information to share, it’s hard to develop for.
One thing’s for sure: it’s going to be a whole lot better than what we have right now. Even if they just allowed mapping of parameters across multiple encoder/fader pages, and let you focus the hardware on each plugin/instrument instance, that would be a great first step.