In my tests the MIDI Instruments have no audio capabilities as far as Dorico is concerned. Meaning you can’t get their audio back into Dorico, e.g. by selecting an audio interface input port. Any thoughts or plans around this?
Since Dorico has no means for recording audio, I’m not sure how this capability would help you.
Somewhere in the back of my mind, I recall someone (was it you?) wanting to use their DAW to supply sounds and manage them in the Dorico mixer. If that is the case, might Vienna Software’s VEP 7 let you manage external sounds to use them directly from Dorico?
While composing I need to create temps to share with the team, so I’m using VEP 7 as a backend, which works great with Dorico’s audio export (I love it, in fact; really simple and quick). However, the MIDI instruments don’t render out, since their audio goes into the interface’s analog-in ports. I can hear it while working, but I don’t really have a way of recording synced with Dorico, so yes, I’m looking for an audio-in port addition to those instruments.
It probably wasn’t me regarding the Mixer, as I’m not (presently) using it. After the music’s written I need to get it into Nuendo, but I haven’t tested those waters much yet.
Do you need the parts kept separate, or would trying to feed your stereo (computer) audio output through something like Audacity work?
I just need a single wav file to share with the team and to temp in the game. I’d have to record the synths separately and combine them into the wav, which kind of defeats the point of a simple, quick export.
There are several issues here. Firstly, Dorico doesn’t have any kind of audio input or recording capability. Even if it did, the next problem is that it has no way of knowing where the audio produced by a MIDI device will end up: there’s no difference between plugging in a DX7 and running in VEPro. We don’t currently have a plan for this.
That’s a pity. In my case I had hoped by now to migrate from Sibelius to Dorico. I use a lot of VSL instruments, but also an external sound module which I need to route into Vienna Ensemble Pro to apply MIR reverberation. This is not a problem in Sibelius, nor in Cubase, where the FX channel can use the VE Pro Audio Input plugin to pass the audio into the VE Pro server. The application doesn’t need to understand the VE Pro MIDI-channel-to-audio mapping; I simply define the sound module’s audio return channels in Cubase’s input config and feed that into the FX channel. While I can use the VE Pro Audio Input plugin in the Output mixer channel, this sends everything rather than just the sound module.
I appreciate you wouldn’t want to start replicating large parts of a full DAW in Dorico.
I’d also like to be able to pull audio interface inputs into Dorico’s mixer in order to take advantage of the mixer effect inserts. Dorico is nice enough to give us direct access to MIDI devices as endpoints, and it comes with all sorts of REALLY NICE effect plugins on the Mixing Desk… too bad we can’t easily bring our big keyboard workstations and old rack instruments into that Dorico mixing board for processing.
I have a couple of workarounds for Windows, but they’re pretty roundabout and involve third-party software.
The first part of the solution involves an advanced ASIO driver called ASIOLink Pro (now free) that lets you patch your system’s audio routing. It also allows pulling WDM (Windows Driver Model) audio drivers into the ASIO back end (all those times you wanted to pull a browser stream or something into your DAW to sample or make a track but couldn’t find an easy solution… well, here it is!).
Another option out there is jack2, though it’s harder to set up, isn’t as tightly coded/optimized for minimal latency, and isn’t as powerful; nor does it pull in WDM drivers unless you also stack it with something like ASIO4ALL. It also lacks ASIOLink’s intuitive, easy-to-use audio loop feature.
Once I have this flexibility to run multiple ASIO apps on the same device/driver, and to route the inputs and outputs of those apps anywhere I want, I’ll start a second DAW to host an outgoing instance of ReaStream and serve as an avenue to get my audio interface inputs of choice streaming into the ASIO router mentioned above. Personally I like to use a stand-alone instance of Bidule for this, which does a lot of other things for me with MIDI controllers before they hit Dorico: it translates some OSC remote stuff from tablets into MIDI that Dorico can bind to commands, and provides some other real-time transformations for my MIDI keyboard and wind jammer. So I’ve usually got a stand-alone Bidule instance going well before I launch Dorico, Sibelius, Cubase, or anything else that involves MIDI or OSC controllers anyway.
If you don’t have Bidule, substitute your ASIO-compatible DAW or VST hosting software of choice. It doesn’t have to be a big fancy DAW; the goal is simple. You just need something capable of getting the inputs from your audio card into an instance of ReaStream. Just make sure it’s linked up with an ASIOLink instance that uses the same audio device/card you’ll be playing Dorico through (so the master audio clock, buffer sizes, latency, etc. are all the same).
Back in Dorico, I’ll set its audio device to ASIOLink, connect its outputs to whatever my speakers play from, and run a receiving instance of ReaStream somewhere in Dorico. Personally, I like to run an instance of Bidule as a VSTi and host a receiving instance of ReaStream there. If one doesn’t have Bidule, ReaStream should run OK in one of the effect inserts of the Dorico Mixer, and that provides a way to host a few of Dorico’s effect plugins lower in the chain, plus whatever is in the FX channel if a send value is set.
Example (using jack2 rather than ASIOLink, since that’s already set up on this rig; same concept):
Roland Fantom XR → M-Audio Delta 1010 computer interface’s SPDIF input → jack2 router → Bidule (synced to the Fantom’s SPDIF clock through the Delta 1010) → ReaStream out → ReaStream in (hosted in Dorico) → Dorico’s main output → jack2 router → Delta 1010 output for my amp/speakers.
When establishing the network connection between the two ReaStream instances, I’m careful to use the loopback broadcast address (127.255.255.255); no sense in broadcasting it all over my LAN. Since it’s internally broadcast only, without actually going through any physical networking interface, it should be plenty fast enough on most systems to do a stereo pair with acceptable latency.
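To illustrate why that address keeps the traffic on the machine, here is a minimal, hypothetical sketch of the socket setup involved: destinations ending in .255 need the broadcast option, and anything in the 127.0.0.0/8 range is delivered only via the loopback interface. The port number and packet contents are placeholders, not ReaStream’s actual wire format.

```python
import socket

def make_loopback_broadcast_sender(dest="127.255.255.255", port=58710):
    """Configure a UDP socket that broadcasts only within loopback.
    Port 58710 is an arbitrary example, not ReaStream's real default."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # x.x.x.255 destinations are refused unless SO_BROADCAST is set
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    return s, (dest, port)

sender, addr = make_loopback_broadcast_sender()
n = sender.sendto(b"\x00" * 64, addr)  # dummy 64-byte packet; never leaves the machine
sender.close()
```

Any receiver bound to that port on the same machine would see the packets, while nothing ever reaches the LAN router.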
Caution: if you’re trying to use ReaStream directly from the Dorico Mixer, don’t try to do any native rendering to audio files with Dorico. Dorico doesn’t seem to send anything to external MIDI devices when exporting audio, but it could cause serious problems the way I do it with a Bidule VST, or even a crash if/when Dorico tries to use its silent/speedy rendering capabilities. Better yet, do your recording in real time from some other DAW (again, use ASIOLink to route things from Dorico to whatever app or DAW you’d like to record with). Bidule can record for me as I have it set up, and ASIOLink Pro can also make FLAC recordings.
I’ve been hosting the ReaStream input instances through Bidule, hosted as a VSTi in Dorico, for years now, rather than trying to host ReaStream directly in a Dorico mixer effect slot (this tricks Dorico into thinking it’s simply talking to a VSTi plugin). I’ve found it to be very stable; it doesn’t crash if I forget and try to have Dorico render a wav/mp3/whatever.
When hosting directly in a Dorico effect slot, I have gotten crashes before; they occur if I ask Dorico to render a mixdown to audio. Again, the catch here is that I do NOT try to use Dorico’s native render-to-audio features if I’m using one of these old rack MIDI instruments set up this way! Personally, I just put an instance of Bidule in Dorico’s main audio channel and record from there in real time… or ASIOLink Pro can record it, or I just route the signal over to Cubase if I happen to have an instance of that running as well.
Mac users could probably do something similar, as that OS features Core Audio, which might be a lot like the jack2 setup mentioned above… perhaps even better?
Dorico has audio-in in the form of a VST. I’m not experienced with the API, but it has to be some form of streaming audio over a shared-memory FIFO or something. However that works exactly, MIDI goes out to the VST and time-aligned audio comes in, which is sent over the internal mixing bus.
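The hand-off being guessed at above (plugin pushes timestamped blocks, host pulls them back time-aligned) can be sketched roughly like this. Everything here is an illustrative toy, not the actual VST audio-processing API:

```python
from collections import deque

class AudioFifo:
    """Toy model of a host/plugin audio hand-off: blocks carry a sample
    timestamp so the mixer can keep them aligned with the MIDI it sent."""
    def __init__(self, max_blocks=8):
        self.blocks = deque(maxlen=max_blocks)

    def push(self, sample_time, samples):
        # Plugin side: hand a rendered block (with its start time) to the host
        self.blocks.append((sample_time, samples))

    def pull(self):
        # Host/mixer side: take the oldest block, preserving order/alignment
        return self.blocks.popleft() if self.blocks else None

fifo = AudioFifo()
fifo.push(0, [0.0] * 512)    # block starting at sample 0
fifo.push(512, [0.1] * 512)  # next 512-sample block
t, block = fifo.pull()       # mixer consumes blocks in time order
```

The real mechanism is certainly more involved (lock-free buffers, sample-accurate event queues), but the principle is the same: audio comes back tagged with where it belongs in time.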
With a MIDI instrument, the MIDI-out portion is aligned with the rest of the MIDI streams. My thinking is that if an audio-in port could be associated with that MIDI endpoint, that would be all that is needed. My RED interface has 62 audio-in ports distributed between mic/analog, ADAT, SPDIF and Dante. So if I could associate one of those ports with a MIDI endpoint, it should otherwise act similarly to a VST which has MIDI out/audio in. The main architectural change is that Dorico currently only associates the audio interface in Preferences, where it gives you a picker for the interface and which two ports to use. Here the interface input ports would need to be available in Play mode.
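The association being proposed is small enough to sketch as a data structure: each MIDI endpoint optionally carries the interface input pair its sound returns on. All names below are hypothetical illustrations, not Dorico internals:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MidiEndpoint:
    """A MIDI-out endpoint, optionally paired with the audio-interface
    input channels (left, right) where its sound comes back in."""
    name: str
    midi_out_port: str
    audio_return: Optional[Tuple[int, int]] = None

endpoints = [
    MidiEndpoint("Fantom XR", "MIDI Out 1", audio_return=(5, 6)),  # e.g. SPDIF L/R
    MidiEndpoint("DX7", "MIDI Out 2"),  # no return declared: audible, not mixable
]

# The mixer would only create channels for endpoints that declare a return:
renderable = [e.name for e in endpoints if e.audio_return is not None]
```

With that one extra field, an external instrument looks to the mixer much like a VSTi: MIDI out, time-aligned audio in.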
This would be great. Of course I understand how busy you folks are with a million other features, but AFAIK it’s conceptually not a difficult problem.
You are missing the basic fact that your RED interface is not a VST device and knows nothing about VST.
For real-time rendering it doesn’t need to know. Just make it so one can add inputs from the audio interface into the mixing matrix (by ASIO rules they’ll need to come in from the same audio driver/device Dorico is using for output, so same clock, buffers, etc.). If those inputs make it onto the mixer, one could then apply effects, EQ, balance, etc. in the Dorico mixer before they hit the mains. Whatever goes through the mains, Dorico should be able to render to audio files.
If a user with such external gear asks Dorico to render a score to an audio file, check whether any such audio inputs are active on the mixer. If so, ask the user if they’d like to render in real time (with or without muting the sound to the speakers in the process), or ignore those channels and do an instant render (which most virtual/software VST plugins can do, since it’s all in the digital domain).
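That export decision is simple enough to sketch. This is a hypothetical outline of the logic being suggested, not anything from Dorico’s codebase:

```python
def choose_render_mode(mixer_channels, ask_user):
    """Pick a render strategy: offline (faster than real time) when the mix
    is all-digital, real time when live hardware inputs are in the mix.
    mixer_channels: dicts with an optional 'hardware_input' entry.
    ask_user: callback returning True if the user accepts a real-time pass."""
    live_inputs = [ch for ch in mixer_channels if ch.get("hardware_input")]
    if not live_inputs:
        return "offline"  # all in the digital domain: instant render is safe
    # External gear can only be captured as it actually plays
    return "realtime" if ask_user(live_inputs) else "offline-ignore-inputs"

mode = choose_render_mode(
    [{"name": "Strings"}, {"name": "Fantom", "hardware_input": (5, 6)}],
    ask_user=lambda chans: True,
)
```

The key point is that the presence of one hardware input only forces the slow path for that export; purely virtual projects keep the instant render untouched.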
As it stands one has to maintain a separate mix for external gear, or do a dance to ‘hack’ it.
Exactly, that is precisely my point: create a ‘virtual’ VST which combines any analog-in port with a MIDI out. Both ports have time alignment. OK, easy for me to say, but I wouldn’t expect it to be terribly difficult either.
Another way to look at this: what’s the point of a MIDI out if there’s no corresponding audio in? Obviously, because we can listen to the instrument via some other channel; but if we’re doing that, why not just bring that channel back into Dorico?
In my case I’m fortunate that I can do it at all, only because the RED has a number of internal hardware busses that I can program to combine with Dorico’s two-channel output before going to my monitor controller. But it’s troublesome because it requires a special setup just for Dorico playback, and I’m missing my synth section from the orchestra when I have Dorico export audio. And internal bussing is a bit more pro; a lot of people don’t have the option with their interfaces.
- Roland Fantom XR rack-mounted synth/sampler
- M-Audio Delta 1010 audio interface
- jack2 or ASIOLink Pro ASIO routing software (free)
- ReaStream network streaming software (free)
- NanoHost VST plugin hosting software (free)
You need an ASIO router in order to get audio inputs into an instance of ReaStream. In this example I’m using jack2 for my ASIO router and have my Roland Fantom XR plugged into the Delta 1010 SPDIF inputs. jack2 routes it to an instance of NanoHost.
In Dorico’s Mixing tab I’ll run a receiving instance of ReaStream in a VST effect slot. (In this case I’ve put it on the mains, but you could get creative and put it in a HALion Sonic SE channel or something that’s loaded in the VST rack but doesn’t have an endpoint assigned, and thus be able to take advantage of the FX send channel and have even more flexibility on any effects you’d like to add to the chain.)
With this setup Dorico can NOT export MIDI-device endpoints to audio! At best it just won’t play your MIDI-device-attached endpoints during rendering. At worst, it might cause a crash (depending on how you’ve got it set up, and with what software).
Instead, you’ll need to record in real time with another DAW (it must be ASIO-compatible for jack2, while ASIOLink Pro can route to almost anything). Use your ASIO router to set up the signal routing required. If you’ll be using a DAW that can assign audio devices to the mixing matrix and can also host VST plugins (such as any version of Cubase), then you could skip NanoHost and host the outgoing ReaStream instance there instead.
You keep using that word. I do not think it means what you think it means. (The Princess Bride) I’m impressed with all you have tried, but it all falls apart for me if Dorico can’t render it.
If I’m down to recording in a DAW in real time, it seems like you can do the same thing simpler (for real) just sending the output from external devices directly to channels in the DAW.
Yes, that’s what I described.
Using an ASIO router to get audio from one app into another. You could also do that with hard cable patches… but Dorico still has no way to natively bring your audio device inputs into its own mixer. That was the OP’s ‘feature request’; I’ve just offered a workaround to get it done.
Yes, one could do a separate mix for your external kit elsewhere, but using ReaStream brings it into the Dorico Mixer, where you can work from the same set of effect plugins, and ALSO mix VSTi plugins with your external kit.
One could also just not host anything in Dorico at all, and send all the endpoints to virtual MIDI ports, and then into a DAW that’s hosting the instruments and effects. Again, one would have to record in the DAW in real time rather than using Dorico’s native audio exporter.
The reason we can’t currently render this sort of external kit with Dorico is that Dorico is expecting real VSTi instruments, which are all in the digital domain and can work faster than real time. With any old external kit that doesn’t come with new/special/modern VSTi interfaces, rendering has to be done in real time.
For now, Dorico doesn’t seem to be able to do this. When I’ve tried it, the material sent to MIDI devices doesn’t go out; just the VSTi plugins hosted in Dorico render. If I do it through a Bidule VSTi instance, Dorico doesn’t know it’s not a true VSTi, so things can get messy-sounding in the rendered results (possibly a crash; rare for me, but I’ve seen it before and chalked it up to trying to instant-render with the external kit patched in like this).
So… if I want to use that old external MIDI kit for sounds, I mix down in real time, with some other app doing the recording (usually a simple Bidule for me; one I made that I call Dorico Tracker).
I understand that the goal is to get export directly from within Dorico in the future, but couldn’t you double your mixdown channels on the RED in the meantime and just send those to a DAW? Since Focusrite has control software for their interfaces, it shouldn’t be too difficult to copy the signals you need and send them somewhere convenient that isn’t Dorico. Perhaps the largest caveat is that rendering would have to be done in real time.
Hm, that’s interesting, let me think. Yes, the simple approach is that there’s a loopback bus in the RED which I can throw a custom mix configuration onto to sum all the channels, Dorico’s and the synths’. That would definitely work, but it would be real time, which is fine anyhow since the synths have to be real time. The downside is that there’s no synchronization between Dorico and the DAW, so I’d have to do some fiddling to get it to work, which kind of gets around the whole ease-of-use thing. But it would work, so thanks for the option.
One silver lining is that if you use Dorico to generate the playing (which is captured by the DAW) and you also screen-capture the performance, the audio and video should sync up perfectly; you’ll just have to shift the audio in post a bit to compensate for the delay, but that should be pretty straightforward.
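The "shift the audio in post" step can be done with nothing but the standard library, assuming the captured audio lags the video by a known, measured amount: trim that much from the front of the WAV and the two line up. File names and the delay value are placeholders:

```python
import wave

def trim_wav_start(src, dst, delay_ms):
    """Drop the first delay_ms of audio so the capture lines up with the video.
    Assumes the measured latency is known; adjust delay_ms to taste."""
    with wave.open(src, "rb") as r:
        params = r.getparams()
        frames = r.readframes(r.getnframes())
    # latency in frames, then in bytes (frames * bytes-per-frame)
    skip = int(params.framerate * delay_ms / 1000) * params.sampwidth * params.nchannels
    with wave.open(dst, "wb") as w:
        w.setparams(params)          # frame count in header is patched on close
        w.writeframes(frames[skip:])
```

Usage would be something like `trim_wav_start("capture.wav", "aligned.wav", 12)` for a 12 ms measured delay. Any DAW can do the same nudge visually, of course; this is just the scriptable version.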
If you really want to run other DAWs in full sync…
I’ve tried an LTC to MTC hack with Dorico and it works.
Drop an old-fashioned SMPTE stripe into a video and have Dorico play that with its built-in video player.
Run a line from whichever channel carries the LTC stripe into some sort of LTC-to-MTC converter. I have old black boxes lying around that do the job, and there are software options for PC/Mac out there as well. I’ve personally tested it with an old Anatek SMP7 MIDI patch bay that includes analog LTC-to-MTC conversion.
Personally, I actually keep a stripe to play with Bidule rather than using Dorico’s video player. I’ve built a tracker Bidule that knows when to start and stop based on Dorico’s play status and the ASIO sample count. Once Dorico is rolling, my Cubase session’s transport locks right up with Dorico. Starting/stopping/cueing around in Dorico is no problem; the Cubase transport stays in sync.
Sadly, the reverse isn’t true. Dorico has no way to act as a slave to another DAW, but if I treat it more or less like the master, other things will lock up with Dorico’s transport.
If for some reason you can’t spare an audio output for an SMPTE track in your main video: if you have Bidule, that works. If not, you could side-host the SMPTE track (just bars, black, whatever, with the SMPTE audio channel fed to a spare output on your audio device) in something like VidPlayVST hosted from Dorico.
Believe me, I do understand that it’s a lot of hoops to jump through to sync a scoring app with other stuff… but if one really wants to use Dorico in parallel with other stuff that badly, it’s worth giving it a try.
That’s a pretty good explanation, Brian; very useful. I gave it another try (lots of different tries, really), hoping that I could find a way we all might have missed. I ran across an interesting (to me) discussion of the inaccuracies of MTC, where they recorded a number of takes in Logic, Cubase, Reaper, and Pro Tools. The differences were small but noticeable: subsequent takes don’t line up exactly for the exact same config and DAW, or between DAWs. I imagine it doesn’t improve with a more complex routing and conversion chain from SMPTE to MTC.
My goal is to be able to change any given individual part(s) using Dorico after it’s been recorded in the DAW, especially if an editor is relying on the overall timing, and the timing of other parts, being exactly the same as before. I might be using external MIDI ports to compose, but I think for me the best answer is to export a MIDI file from Dorico when it’s time to record.
Although Dorico exports a single MIDI file, in Studio One at least the various individual parts are listed out and available to drag to a specific track. So I can update just one instrument, let’s say, and be sample-accurate in timing. It is an extra step, but maybe not more so than recording in real time. And maybe a little better protected from some MIDI glitches I’ve seen in the past on my system.