External Instruments: Render in Place alternatives that keep Delay Compensation, etc.

Hello,

I’ve used External Instruments for a while, but I want to speed up my workflow, meaning I want to avoid dragging and dropping separate MIDI Events onto the External Instrument channel every time I want to Render in Place.

It’s a bit cumbersome, but at the same time I want to keep all the advantages of the External Instrument setup, including Delay Compensation.

My question: instead of using Render in Place on the External Instrument as described above, will I get the same result (with Delay Compensation etc.) if I let the External Instrument be played by a regular MIDI Track and simply record its audio output back onto the correct bussed Audio Channel?

Has anyone tried this workflow?

Cheers.

That’s how we used to do it in the past, so there is no reason you can’t do it now.

Of course, but that doesn’t answer my primary concern: What’s the difference?

If the MIDI data is NOT placed on the External Instrument track but on a regular MIDI Track, is that the same as in the old days, when we just had MIDI Tracks and Audio Tracks?

Options:

1) MIDI Track > MIDI Out > Hardware > (back to) Audio Track

2) MIDI Track > MIDI Device (loaded as External Instrument VST) > Hardware > (back to) Audio Track

3) MIDI Track > External Instrument VST > Render in Place

4) MIDI Track > External Instrument VST > Hardware > (back to) Audio Track

5) External Instrument VST > Render in Place

Which of the above gives the “best” result, or the same result as number 5, in terms of Delay Compensation etc.? In fact, in terms of all the things that loading an External Instrument VST Track solves.

Every scenario is the same. Render in Place still sends the MIDI to the external instrument, which then plays back in real time, and the output is recorded to an audio track.
Every single scenario does exactly that; they are just different ways of looking at it.
The external instrument has to receive midi to play.
The external instrument has to output audio in real time.
Cubase has to send the midi.
Cubase has to record the audio in real time.

There should be no difference no matter how you dress it up.

That’s not what the Cubase Pro manual suggests. There’s a lot going on under the hood when using hardware as External Instruments in Cubase: delay compensation, MIDI timing that is automatically adjusted for any latency, the hardware delay setting, return gain, etc.
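For what it’s worth, here is a minimal sketch of what that compensation boils down to arithmetically. This is an assumed model for illustration only, not Cubase’s actual implementation, and all the figures are made up: the recorded return signal has to be shifted earlier by the sum of the outgoing latency, the user-entered hardware delay, and the return-path latency.

```python
# Minimal sketch of external-instrument delay compensation (assumed model,
# NOT Cubase's actual implementation). All values are hypothetical, in ms.

def total_compensation_ms(output_latency_ms: float,
                          hardware_delay_ms: float,
                          input_latency_ms: float) -> float:
    """Amount the recorded return audio must be shifted earlier so it
    lines up with the project timeline."""
    return output_latency_ms + hardware_delay_ms + input_latency_ms

if __name__ == "__main__":
    # e.g. 3 ms interface output latency, 5 ms delay entered in the
    # External Instrument setup, 3 ms interface input (return) latency
    print(total_compensation_ms(3.0, 5.0, 3.0))  # -> 11.0
```

With a plain MIDI track + audio track setup you would have to measure and nudge that offset yourself, which is essentially what is being debated here.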

The advantage of doing it the “old way” is that input bus connections are saved with the project. On the other hand, loading a bus setup from a preset doesn’t add an Input Bus; it replaces all the current settings instead.

An External Instrument preset just adds its bus to the one(s) already present.

That is the difference between Input Buses and External Instrument Buses in this regard.

If all the above-mentioned solutions were the same, there would have been no demand (or even reason) for Steinberg to develop the External Instrument as a track channel in the first place.

One advantage I see over the old way is that Input Bus settings get saved with the project.

Honestly,

I fail to understand what slows your workflow down when using Render in Place on External Instruments. *)

I also think you are overthinking this, making it seem more complicated than it is.

*) I use this method all the time, having moved away from individual MIDI tracks manually recorded onto audio tracks. So maybe I am biased.

The thing is that I must drag and drop 40-100 MIDI channels manually to and from the External Instrument Track every time I want to render!

I had a project with over 100 MIDI Track channels to be rendered. Cumbersome and time-consuming as hell!

That’s the only reason I’m again considering going back to VST Instruments alone.
It seems to be an unsolvable equation, at least for my workflow.

I was confused by the external instrument track (coming from Cubase 6) as well.
My understanding is that it is a convenient shortcut for one MIDI track playing on an external instrument with its audio loopback.
I am using 16 MIDI tracks on different channels sending to the same external instrument and that works well.
So there is no need to drag your MIDI patterns onto the instrument track; just route their output to the instrument itself.

Your wording is a bit imprecise. I am not good with such “loose language”, so let me try to clear up your wording for myself:
You cannot drag a MIDI channel; that is physically impossible. You can route 100 MIDI tracks to the same External Instrument, just as you can route 100 MIDI tracks to the same VSTi.
Your External Instrument can only process a certain number of MIDI channels at the same time. Some instruments can only process one or two, many can process 16, some even 32 MIDI channels.
Most hardware instruments have no more than 4 mono audio outputs (or 2 stereo pairs). Maybe some support more than 4 audio channels, but surely not 100.

So the restriction really comes from the hardware instruments, not from Cubase. And the workflow is the same whether you use External Instruments or (the old-school way) MIDI tracks + audio tracks.

Of course it is faster and more comfortable to use virtual instruments in a DAW. That’s the reason why they became so popular after Steinberg invented them more than 25 years ago.
Did you think hardware synths are as elegant to use as VSTi’s?

How would you expect it to work? Nearly everything depends on the external synth and how many physical inputs you have. I can’t think of a way to make it simpler if you are re-using the same inputs all the time. The only relevant thing is the delay compensation, which comes in useful if the external instrument has latency after receiving MIDI. That would not impact the actual timing, though; if you get it wrong, it is just a matter of moving the part by the right amount. I was glad to get rid of External Instruments, as recording audio from a multitimbral synth involved muting all but one track and playing back while recording.

I stick to VSTis, but if you don’t, it’s always going to be time-consuming unless you have a huge number of inputs and multitimbral synths with multiple outputs.

No, actually it is, if you want to use the ‘Render in Place’ feature on the MIDI Event content of the External Instrument Track.
You can’t do ‘Render in Place’ on standard MIDI Track events.

I meant MIDI Track. MIDI Channel = MIDI Track.

I know, I already explained this. I meant MIDI Events from a MIDI Track.

That’s not the same thing considering my workflow. By design, every VSTi Track can be ‘Rendered in Place’, just like that, without dragging and dropping anything to and from the External Instrument Track.

Exactly my point! So, unless you are Chris Rea and work with only 4 channels in your productions, it is inevitable that a lot of MIDI Event drag and drop must occur.

As I already explained, it is not my point.

Well, you’ve got a point there. I guess the “middle route” would be to use hardware synthesizers with built-in audio rendering. That way, I could just export the already-created audio stems.

Now, that opens a totally new chapter for discussion, as I only use MIDI hardware with a specific character (older machines) from a time when audio rendering in the box was kind of sci-fi :slight_smile:

Exactly (well, almost) as it works now. The only difference is that we should have the ability to fire up ‘Render in Place’ directly from a MIDI Track connected to an External Instrument (Inspector >> Output Routing >> MIDI Devices >> External Instrument).

My feature suggestion would be for Cubase to sense such a connection automatically, allowing the MIDI Events on that MIDI Track to be ‘Rendered in Place’ through the External Instrument Track.

Now, at least for me, that would change the game drastically! For the better, of course.

Cubase External Instrument Tracks are a superb idea, and they work. This would be a massive workflow enhancement, especially for a workflow that requires efficiency.

Well, I just created a Feature Request thread, which I hope isn’t illegal. :laughing:

Ok, glad we could clear up the lingo. Now I understand better what you are talking about… and you are absolutely right.
Render in Place should work with connected MIDI tracks, but it doesn’t. In this case it is better not to use an External Instrument definition and to go the “classic” way instead, i.e. normal MIDI tracks + one or more audio tracks.
Bummer.
