Making a sequence on one machine that will ultimately be performed on different instruments, in different settings, IS PUZZLING.
It’s an ‘engineering puzzle’ that involves a lot of ‘lower level techy type stuff’, and you can’t put it together if you don’t know where the pieces are stored, and haven’t explored/practiced ‘putting them together’.
I’m not sure what ‘Maybe just a far better person [slight smile]’ is supposed to mean. Despite being an ‘inferior person’, I’m simply trying to contribute something that might be helpful to the OP, and perhaps the community at large.
I wasn’t sleeping well after a long day of work when I responded to OP, and drafting these posts on occasion is one way to relax physically while also exhausting my brain enough to actually crash for a few hours. I woke up ‘grumpy’ and ‘oversensitive’.
I can simply and directly explain how to make a stave, or even different voices on a stave, render to an SMF on the specific channels the OP asked for. I could even ‘do the work for him’ and supply a project/template all ready to roll.
The problem with that, in my mind, is that it wouldn’t walk through the concepts of how and why it works, or where I went in Dorico to build them. While it might answer a few immediate questions, it wouldn’t be enough to accomplish the goal of regularly building scores that easily render into a usable MIDI file that’s going to ‘sound good’ on another instrument, in a different setting.
So, I shared the sort of generalized exercise that mentors and professors put me through decades ago. Example Objective Rubric: Use a given set of tools involving 5 or more tracks/staves to render a MIDI file conforming to the General MIDI 1 format. The resulting file should at least use Program Change events to initialize the instruments, trigger the initialized sounds with note on/off events, involve at least 2 forms of dynamic control, use at least 2 continuous controllers, establish a staging and spacing for the performance, incorporate at least one instant tempo change and at least one gradual tempo change, and demonstrate at least two methods to achieve a crescendo or decrescendo effect. (Etc…)
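For anyone curious what such a rubric boils down to at the byte level, here is a minimal sketch of a Standard MIDI File built with nothing but Python’s standard library. The tempo, program, and note values are arbitrary placeholders of my own, not anything from OP’s project:

```python
import struct

def varlen(n):
    """Encode n as a MIDI variable-length quantity (7 bits per byte)."""
    out = bytearray([n & 0x7F])
    n >>= 7
    while n:
        out.insert(0, 0x80 | (n & 0x7F))
        n >>= 7
    return bytes(out)

def track_bytes(events):
    """events: list of (delta_ticks, message_bytes) pairs -> one MTrk chunk."""
    body = b''.join(varlen(d) + msg for d, msg in events)
    body += b'\x00\xff\x2f\x00'  # end-of-track meta event
    return b'MTrk' + struct.pack('>I', len(body)) + body

TICKS = 480  # pulses per quarter note

events = [
    (0, b'\xff\x51\x03' + (60_000_000 // 100).to_bytes(3, 'big')),  # tempo meta: 100 BPM
    (0, bytes([0xC0, 73])),         # Program Change, ch 1 -> GM Flute (program 74)
    (0, bytes([0xB0, 7, 100])),     # CC7 (channel volume) as one dynamic control
    (0, bytes([0x90, 60, 64])),     # note on, middle C, moderate velocity
    (TICKS, bytes([0x80, 60, 0])),  # note off one quarter note later
]

# Header chunk: length 6, format 0, one track, 480 ticks per quarter
smf = b'MThd' + struct.pack('>IHHH', 6, 0, 1, TICKS) + track_bytes(events)
with open('demo.mid', 'wb') as f:
    f.write(smf)
```

A full pass at the rubric would of course layer on more channels, a second controller, and the tempo/crescendo shapes; this just shows that the file format itself is small and inspectable.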
So I went a little further than OP asked, but not much. I went a tad off the mark with talk about Program Changes on purpose; I did it within a GM framework, with a symphony instead of a few organ ranks, on purpose; and I did it with an ‘external instrument’ first and a virtual one later on purpose. Here is WHY.
Mainly to give a brief tour around Dorico. Go here to change an end point’s actual transmission channel. Go there to tweak a global dynamic curve. Go over yonder to establish whether you want velocity, a CC, or both controlling dynamics. Turn to yet a different page if you want to set up or ‘change’ sounds/channels ‘on the fly’. Need a reference plugin? Dorico comes with some…have a quick tour of one of them (Sonic).
I don’t think that simply having Dorico render the notes to the proper channels is going to be ‘good enough’ to just walk into the sanctuary or theatre, pipe the file into the console, and get good results right away. If OP is not building the score while sitting at the instrument, with real-time access to it, dynamics can be all over the map in some rather performance-wrecking ways.
When composing away from the actual target instrument(s), some kind of ‘reference’ plugin or instrument needs to be mastered and understood. A very simple general MIDI player with very predictable and familiar banks of sounds is a good place to start. Exploring the various options and event types that Dorico offers will be paramount to understanding more than how to make a MIDI file that ‘plays back the notes on the right channels’.
Having a ‘reference point’ of some kind to listen to while composing away from the instrument matters. Even if it’s as simple as a solitary flute, oboe, trumpet, some various mixes of these three basic timbres, and maybe a clean sine-wave synth sound of some sort for the bass pedals.
Having some kind of clean and predictable reference on playback will help.
Understanding how the rendered MIDI file is going to translate on the actual instrument will take a little practice.
I felt that demoing the concept of setting up a symphonic score that’d work and sound similar in any GM player would answer the direct question of where to go to get Dorico transmitting on the correct channels. It’d also touch on a lot more, and answer quite a few questions that I’m pretty confident the OP will have ‘pretty darn quick’ if he’s going to be rendering MIDI files for instruments that are ‘offline’ to him at the time he is composing.
Playing the right notes on the right channels isn’t really good enough. Dynamics matter, phrasing matters, getting things to ‘balance and blend’ well matters.
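To make the ‘two methods to achieve a crescendo’ point concrete: in raw MIDI terms the usual choices are ramping note-on velocities across successive notes, or ramping CC11 (Expression) under a held note. A quick illustrative sketch, with values I’ve picked arbitrarily:

```python
def velocity_crescendo(notes, start=40, end=110):
    """Method 1: assign rising note-on velocities across a run of notes."""
    step = (end - start) / max(len(notes) - 1, 1)
    return [(n, round(start + i * step)) for i, n in enumerate(notes)]

def cc11_ramp(start=40, end=110, steps=8, ticks=480):
    """Method 2: (tick, controller, value) CC11 events spread under one held note."""
    return [(round(i * ticks / steps), 11, round(start + i * (end - start) / steps))
            for i in range(steps + 1)]

# Three rising notes get louder attack by attack...
print(velocity_crescendo([60, 62, 64]))   # -> [(60, 40), (62, 75), (64, 110)]
# ...while a sustained note swells continuously under CC11.
print(cc11_ramp()[0], cc11_ramp()[-1])    # -> (0, 11, 40) (480, 11, 110)
```

Which method an instrument responds to depends on how its dynamics are mapped, which is exactly why Dorico lets you choose velocity, a CC, or both.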
I believe it starts with clicking around in Dorico in the respective ‘neighborhoods’ to achieve different things. Thus, I chose a similar, but slightly different and more involved, concept (than initially asked for) to guide OP into clicking into those ‘neighborhoods’ and ‘fiddling around’. Undoubtedly, he’ll see and learn way more along the way. To ‘really’ go step by step would have taken FOUR TIMES as many screenshots. I just hoped to guide him into the right ‘neighborhood’ with some methods to understand and predict how Dorico behaves, and to develop an ear for rendering good MIDI performances that should translate better across ‘different instruments’.
I can come back later and give specific examples of how to build an organ in Sonic that can simulate enough aspects of an organ performance to at least serve as a nice ‘reference station’. That’ll help OP learn to make and render scores at home, away from the actual target organ, that sound much closer to what he ‘expects’ on the specific organ and room he’ll be performing in.
I haven’t built such a thing in Sonic yet, and others (like Romanos) may have already built one and might come along and share it. Until then, OP is offered a little tour of a few possible best practices when setting up scores that will better yield something ‘closer’ to the expected/desired results when moved over to a ‘different instrument’ in a ‘different setting’. The ‘concepts’ of setting up a standard ‘general MIDI’ file can certainly be adapted and applied to remote control of organ consoles.
I’m not yet familiar with that specific make of organ/console, but I have seen and used enough to know that they quite often can store stop groups and such that can be called up via MIDI Program Change. They can often accept a pretty broad range of continuous controller events to operate shutters, rotors, valves, slides, dampers, and all sorts of ‘unique and one of a kind’ goodies that might be incorporated into the instrument.
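In practical terms, those registration recalls and shutter moves come down to very small messages. Here’s a generic sketch of building the raw bytes; note that which controller number drives a swell shutter (or anything else) is entirely console-specific, so the CC11 below is purely an assumed example:

```python
def program_change(channel, program):
    """Raw bytes to recall a stored registration/stop group (1-based args)."""
    return bytes([0xC0 | (channel - 1), (program - 1) & 0x7F])

def control_change(channel, controller, value):
    """Raw bytes for a continuous controller move (value 0-127)."""
    return bytes([0xB0 | (channel - 1), controller & 0x7F, value & 0x7F])

# Recall stored registration #5 on channel 1, then set a (hypothetical)
# swell shutter mapped to CC11 to half open:
recall = program_change(1, 5)        # -> b'\xc0\x04'
shutter = control_change(1, 11, 64)  # -> b'\xb0\x0b\x40'
```

Those bytes could go out any MIDI output; the point is just that ‘remote controlling a console’ and ‘authoring a GM file’ use the same handful of message types.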
What I had to offer at the time I saw OP’s question was a load of concepts on how to establish a workflow that will yield ‘accurate/predictable’ outcomes when rendering a MIDI file, while also introducing the idea that once you build something like an ‘expression map’, it can be saved/exported and reused in new projects later.
Tapping into even a fraction of the potential of many of the remote-controllable organ consoles out there will be far more involved than my little ‘Let’s render a GM 1 format version of an old Symphony’ exercise. Wander around the neighborhood…see where a few of Dorico’s tenants are living. Once a user has met those tenants, how to get them involved in the organ performance should soon begin to make sense.