Generic Remote MIDI input using plugins, not just hardware.

I have been using remote devices/Generic Remotes for controlling tracks and plugin values.
It is very nice to literally get your hands on the mix.

I am finding that you can only use hardware MIDI inputs and outputs. I would like the ability to use a project plug-in such as Reaktor as the input, and kind of go virtual with it. This would be like a MIDI track using a plugin as an input, so I know the tech is there.

An example:
Reaktor could be processing data as an insert and controlling Nuendo through the Generic Remote. I have it working, but I have to use a MIDI loopback driver and an extra MIDI track (a rough sketch of that data path follows below).
There is also one more issue where the Generic Remote is not processed when using non-real-time export.
So what I have to do is record everything down to a track and then export.
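To show what I mean by the loopback workaround, here is a minimal sketch in Python (using the mido library) of the data path: something other than hardware writes CC messages into a virtual MIDI port, and the Generic Remote is pointed at that port as its input. The port name, channel and CC number are placeholders for whatever your loopback driver and Generic Remote mapping actually use.

```python
import time
import mido

PORT_NAME = "LoopBe Internal MIDI"  # hypothetical loopback port name
CC_NUMBER = 20                      # CC number the Generic Remote row is learned to
CHANNEL = 0                         # MIDI channel 1

# Sweep the mapped Nuendo parameter from 0 to 127 over roughly 2 seconds.
with mido.open_output(PORT_NAME) as port:
    for value in range(128):
        port.send(mido.Message("control_change", channel=CHANNEL,
                               control=CC_NUMBER, value=value))
        time.sleep(2.0 / 128)
```

In my setup Reaktor plays the role of this script; the point is only that the controller data reaches the Generic Remote through a virtual port rather than a hardware one.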

I see one negative, and that is you can't guarantee the project will have that plug-in. I use templates, so I can guarantee I will have it loaded. There can also be multiple Generic Remotes mapped to different things, so if the plug-in is not there it just won't do anything. Not a big deal.

This is a new frontier and it would be very helpful if Nuendo supported it.

Thoughts?

If I may ask: What are you controlling with Reaktor?

I’m just trying to figure out what the parameter would be that you would control using a VSTi/plugin instead of a hardware device…

If I remember correctly (I don’t work with VSTis that often), faster-than-realtime export won’t do MIDI devices and VSTis. So if you’re using a VSTi as a source, then you won’t get it unless you do a realtime export… I think…

It is not really about which value to control but how to control it.

Imagine having a Reaktor LFO (low-frequency oscillator) driving the frequency of an equalizer plug-in (any EQ you like).
Yes, there already exist some plugins that do this, but do those LFOs randomize to be more organic? Can you choose different waveforms? Can you modify the speed of the LFO to suit your program material?
A lot of these items can be done in a synth or sampler, but I want to be able to modify the host and not have to load files into a sampler, trigger, sync, process and then render.
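To make it concrete, here is a rough sketch (plain Python, nothing Nuendo- or Reaktor-specific) of the kind of LFO I mean: pick a waveform, a rate and a bit of random drift, then quantize the result to 0–127 so it could drive any parameter you can reach over MIDI. All the names and numbers are just illustrative.

```python
import math
import random

def lfo_value(t, rate_hz=0.5, shape="sine", drift=0.1):
    """Return a 0..127 controller value for time t (seconds)."""
    phase = (t * rate_hz) % 1.0
    if shape == "sine":
        x = math.sin(2 * math.pi * phase)
    elif shape == "triangle":
        x = 4 * abs(phase - 0.5) - 1
    elif shape == "saw":
        x = 2 * phase - 1
    else:
        raise ValueError(f"unknown shape: {shape}")
    x += drift * random.uniform(-1.0, 1.0)  # subtle randomness, so it never repeats exactly
    return max(0, min(127, round((x + 1.0) * 63.5)))

# Sample the LFO every 20 ms for one second, e.g. to drive an EQ frequency.
values = [lfo_value(n * 0.02, rate_hz=1.0, shape="triangle") for n in range(50)]
print(values)
```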

Does that help describe what I am doing?

Yes, makes sense…

Great idea man

If I’m getting what you’re asking (and I’m still not quite sure!), MIDI control of parameters within a plugin is up to the developer of that plugin. For example, at its simplest, Blue Cat makes a free Gain Stage audio plugin that can be controlled by MIDI CC and can “learn” which CC number to respond to. Reaktor can send MIDI out, so you can simply assign Reaktor to the Nuendo MIDI input that was created when you instantiated the Gain plugin on an audio track, and then do whatever you want in Reaktor to control it. You can also assign an external input to a MIDI track that is in turn assigned to the same MIDI input and record a live performance. On playback, either of these should render without going real time.

If you want to get clever, you could also control Reaktor parameters in real time from external MIDI, and record that to a MIDI track…

There are many plugins that offer MIDI control of their parameters and many that don’t; most do offer automation, and if anything, I would love to see Nuendo able to map MIDI to that automation, making any automatable parameter MIDI-controllable. IIRC, Fruity Loops already does this.

What am I missing?

@Breeze

Yes, you can patch Reaktor MIDI to another plugin that has MIDI. You have to go through another MIDI channel, and it starts to get confusing with live tracks and patch tracks.

What I am specifically driving through the Generic Remote is the volume of a few effect and group tracks. I am also driving the Quick Controls of these tracks. Reaktor is always hard-wired to these controls so that I do not have to patch anything when I am in production mode. This allows me to easily insert a plugin, assign the Quick Control to the parameter I want, and it is done; back to being an artist.

Besides, having some set of rules in your project can be a good thing.
The key for me is to stay in the creative zone. If things get too complex (because the possibilities are there), you can spend too much time on the engineering side and snuff out the creative side. As a sound designer I am battling this more and more, so I am trying to design tools and workflows that allow me to stay in the groove.

I totally agree with you there, and personally I find there isn’t a single DAW today that is geared towards sound design. We are constantly working against the constraints built in by engineers purposing software for the most common usage, which is the creation of music, and even then we are still fighting things that fall outside the range of popular and conventional music.

And you’re also right that the right tools are the ones that get you to your creative goal the fastest. I think the problem is always that engineers think like engineers; I’ve some experience with that and you have to keep a really close watch on what they’re doing because they don’t think like creatives. “Oh that’s easy: it’s a ten-step process. You do this, and this, and this…”

I have many ideas about this myself. Here’s a freebie: why does every track in a DAW have to run at the same tempo? :wink:

Yes YES. I just got an answer to this sort of question.

This is a huge help, although you have to jump back to the Pool and set each segment. If there were a way to do this on a track it would be ideal. Maybe adding “Tape” to the current time and tempo options? Or adding the Tape function as an optional right-click.

Nuendo is on the right track. There is so much flexibility in routing and editing. I want to thank them for that.
Paired with Reaktor I am able to do some pretty interesting things. Maybe they can make Reaktor more integral, like Max/MSP in Live?

FYI, the default warping option on import can now be set in the preferences! Now I can set “elastique Pro - Tape” and not have to go to the Pool all the time. Huge time saver.
File > Preferences > Editing > Audio > Default Warping Algorithm.

Thanks Steinberg!

Thinking about the original problem, what if the Generic Remote could use OSC as an input? At that point you would not need to patch through using MIDI, and hopefully any OSC device could send to the Generic Remote, inside or outside of Nuendo.
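For illustration, here is what the sending side could look like if the Generic Remote accepted OSC. Nuendo does not support this today, so the port and address patterns below are made up purely to show the idea; anything that speaks OSC (Reaktor, a tablet controller, a script) could then set a mapped control directly, with no MIDI loopback in between.

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical OSC input for a Generic Remote listening on localhost:9001.
client = SimpleUDPClient("127.0.0.1", 9001)

# Made-up address patterns for controls a Generic Remote might expose.
client.send_message("/genericremote/track/3/volume", 0.8)
client.send_message("/genericremote/track/3/quickcontrol/1", 0.25)
```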

Good news:
I see that there is now an option to output Generic Remote MIDI data to a plugin insert and not just hardware. We are SO close to having both! :smile:

Can Steinberg also add this option to the MIDI Input, please? This would open up a lot of doors for sound design and procedural automation.
The image shows both the Input and Output features and the missing software input option.

MIDI input driven from a plugin would give my voice processing pipeline direct access, instead of going through the 01.Internal MIDI loopback driver shown, LoopBe30.
Unfortunately, we have been having issues with the LoopBe30 driver since a Windows 10 update last year: the driver is not seen when using Remote Desktop, and sometimes not even when logged in directly. This is really bad news for our 200+ projects that rely on it. :expressionless:

Thank you!