Assignment of VST Instruments

The most annoying thing in Sibelius and Finale is the assignment of instruments – and Sibelius is, so to speak, even worse.

Is the assignment similar to Cubase, where you just select the player, channel, and Xmap, and save those as a preset?

I have also read in other posts that, on the one hand, Dorico allows you to use Xmaps, while on the other hand there will be a track for every articulation, which is quite common for film composers.

Would someone be so kind as to shed some light on this? :slight_smile:

It’s hard to give a chapter and verse answer on this, since all of the playback functionality is very much in its infancy at the moment.

Allow me to introduce the term “endpoint”, which may well not end up as a user-facing term in the application, but which is nevertheless for us an important concept. An endpoint is a combination of virtual instrument (device), channel, patch (if known), and any required switch information (e.g. a particular key switch or a specific MIDI controller set within a particular range, etc.), that produces a playing technique for an instrumental sound.
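To make the "endpoint" idea concrete, here is a minimal sketch in Python. The class names, fields, and the example keyswitch are my own illustration of the concept as described above, not Dorico's actual internals:

```python
from dataclasses import dataclass, field

@dataclass
class Switch:
    """One mechanism required to select a playing technique."""
    kind: str       # e.g. "keyswitch" or "cc"
    number: int     # key number or controller number
    value: int = 0  # controller value, if applicable

@dataclass
class Endpoint:
    """Device + channel + patch + switches = one playable technique."""
    device: str                                   # virtual instrument
    channel: int                                  # MIDI channel
    patch: str                                    # patch name, if known
    switches: list = field(default_factory=list)  # required switch info

# A hypothetical endpoint for a solo violin staccato sound,
# selected by a single keyswitch on C1 (MIDI note 24):
violin_staccato = Endpoint(
    device="HSO",
    channel=1,
    patch="Violin Solo",
    switches=[Switch(kind="keyswitch", number=24)],
)
```

The point of bundling all four pieces together is that one endpoint fully describes how to produce one playing technique, which is exactly the information Dorico needs per instrument.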

If you’re using the supplied HALion Sonic SE 2 player and its own factory content or HALion Symphonic Orchestra (HSO), then the assignment of instruments to endpoints is automatic: Dorico examines the instruments held by the players, and knows which playing techniques are provided by the HSO instruments, so it can instantiate as many instances of HALion as it needs and load the appropriate HSO patches automatically.

If, however, you’re using any other virtual instrument, then this allocation of instruments to endpoints has to be done manually. You might, for example, load up Vienna Ensemble Pro and load it up with 16 channels of patches. Dorico can’t see inside VE Pro, so it doesn’t know what patches you’ve loaded there (beyond their names), and nor does it know anything about the switches those patches may use to access different playing techniques.

If a suitable expression map is available, then you’ll be able to link that somehow to the instrument’s assignment, so that Dorico knows about the capabilities of that endpoint, and will then be able to perform the necessary switches to access the available playing techniques more or less automatically.

If no expression map is available, then it’ll be up to you to define the capabilities of the endpoints yourself. You will be able to specify what device, channel, patch, and combination of switch mechanisms should be used when a given instrument plays a particular playing technique.

The hope is that it will then be possible to save this information – i.e. the specific set of virtual instruments loaded into the rack, together with the information about which instruments and playing techniques are accommodated by each endpoint within those instruments – in such a way that it can be applied to other projects, as a kind of playback template.

Daniel, thanks a lot – although you are just about to connect the playback features, your description draws a very clear picture of how it will look in the release.

I have heard this question several times, so your answer will definitely clear this up… :slight_smile:

Could a Cubase user who is using expression maps comment on what one does in the case where an expression map does not exist for a particular articulation?

I ask because there is a method that the Logic Pro X users are developing to deal with this situation. Logic does not have expression maps at all. But it can associate an articulation ID with every MIDI event. So some adventurous developers are creating a mechanism to store an articulation ID with every note during recording/note entry. Then there is a MIDI-effects “scripter” plugin that can be inserted in an instrument’s channel strip to map the articulation ID back to a keyswitch or whatever the sample library expects for a given articulation. The articulation ID acts as a generic set of articulations that can be uniquely defined for each instrument in any sample library. Works sort of like a codec where you encode the keyswitch information to an articulation ID during recording/note entry and then decode the articulation ID back to a keyswitch during playback.

This is done to eliminate the extraneous keyswitch in the score for readability and also to solve the chase problem, since Logic does not chase embedded keyswitches. Expression maps don’t have this problem, nor would a notation program, since this is all done using articulation marks and text to denote the articulation. On the playback end, the articulation ID allows one to generate simple scripts to target specific sample library particulars.
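The "codec" behaviour described above can be sketched as a pair of lookup tables. All the key numbers, IDs, and library names below are hypothetical illustrations, not the actual Logic scripter API:

```python
# Encode: at recording/note entry, strip the keyswitch from the stream
# and attach a generic articulation ID to the note instead.
KEYSWITCH_TO_ID = {24: 1, 25: 2, 26: 3}  # e.g. C1=legato, C#1=staccato, D1=pizz

# Decode: a per-instrument table mapping the generic ID back to whatever
# the sample library expects (done by a playback script on the channel).
ID_TO_KEYSWITCH = {
    "MyStringLib Violin": {1: 24, 2: 25, 3: 26},
    "OtherLib Violin":    {1: 36, 2: 38, 3: 40},  # same IDs, different keys
}

def encode(keyswitch_note: int) -> int:
    """Return the generic articulation ID for an incoming keyswitch."""
    return KEYSWITCH_TO_ID[keyswitch_note]

def decode(instrument: str, articulation_id: int) -> int:
    """Return the keyswitch note this particular library expects."""
    return ID_TO_KEYSWITCH[instrument][articulation_id]
```

Because the ID layer is generic, the same recorded part can be retargeted at a different library just by swapping the decode table.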

So does the Cubase audio engine have a scripter or something equivalent? How do you deal with this today?

The Xmap is a sort of interpreter which allows the user to assign specific articulations to graphical signs or text such as staccato, tenuto, slurs, senza vibrato, etc., to name a few.
Also, combinations of up to 4 different signs/texts are possible. The user has detailed features to control the output (KS, transpose, CC, etc.).
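That sign-to-output mapping can be pictured as a table of "slots", one per symbol combination. The symbol names, key numbers, and actions here are made up for illustration and are not the Cubase file format:

```python
# Each slot maps a combination of score symbols/text (up to four in
# Cubase) to the output actions that select the sound: keyswitches (KS),
# controller changes (CC), transposition, etc.
EXPRESSION_MAP = {
    frozenset({"staccato"}):           [("keyswitch", 24)],
    frozenset({"tenuto"}):             [("keyswitch", 25)],
    frozenset({"senza vibrato"}):      [("cc", 11, 0)],
    frozenset({"staccato", "tenuto"}): [("keyswitch", 26), ("transpose", -12)],
}

def actions_for(symbols):
    """Look up the output actions for a combination of score symbols."""
    return EXPRESSION_MAP.get(frozenset(symbols), [])
```

Using a set as the key means the order in which symbols appear on the note doesn't matter, only the combination.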

When there is no Xmap, the GUI in Cubase just explains itself.

Here is a pretty good video by Scott Lewis that explains a lot:

Hope this helps

Thanks. That helped. So you can basically invent whatever attribute or direction that you want and indicate it with a symbol or text. So that’s pretty useful and complete.

So when people complain about the expression maps being limited, what are they talking about?

I have no experience with DAWs, Sequencers, etc. so am not sure I understand what is being said in this thread. I do sort of, in theory, understand Sibelius SoundSets. Is an expression map sort of an equivalent of a SoundSet? And if that is the case, is it also the case that if we use a library other than those that come with Dorico we will have to supply an expression map, and we will have to go through the equivalent to setting up a manual SoundSet (rather than the automatic process Sibelius uses with Garritan libraries and SoundSets)?

Yes, a VST Expression Map fulfils a similar sort of purpose to a Sibelius sound set. Some third-party libraries already have VST Expression Maps available for download:

but I think most Cubase users who use Expression Maps at least heavily customise these, or create them from scratch themselves within Cubase.

If you’re using a third-party sound library in Dorico, you should indeed expect to have to follow a procedure a little like setting up a manual sound set in Sibelius to set up for playback in Dorico, but you will have a greater level of direct control over how Dorico handles the extended playing techniques provided by your sample library, if you choose to take it.

Sounds like there may be a bit of a learning curve here. Is there a manual documenting building expression maps? I couldn’t find mention of one on the web.

The term “third-party” is a bit vague when there could be 4 parties involved. Do any of the creators of sound libraries provide their own Steinberg expression maps, or are these all made by the equivalent of The SoundSet Project? I notice that the Garritan expression maps have a date of 2009 which, I think, predates GPO4. If the others are equally old, there may be a lot of work needed. It would be nice to have that work done by someone that knows what he or she is doing. (I assume Steinberg is not going to do it.)

You can look up the chapter on VST Expression Maps in the Cubase Operation Manual, which you can download as part of a zip file from this page:

However, the user interface in Dorico will not be the same as the interface in Cubase, so although this will help with the concepts of working with Expression Maps, it won’t necessarily be directly transferable to your use of Dorico.

Here are EMs on the Steinberg site:

I haven’t done a great deal of reading on Expression Maps yet. If I was to start working on an EM for a library for which there isn’t currently one available, would that give me a head start on the Dorico end of things?

I know that we want to start working on the Virtual Drumline maps whenever we can, so it would be fantastic if that’s something on which I could get a head start by developing in Cubase—even if it doesn’t yet involve the notated maps.

Yes, you would definitely be getting a head start if you try to set up an Expression Map in Cubase for Virtual Drumline. It would be very interesting to see to what extent you can even practically map the percussion techniques in VDL onto the set of techniques accommodated by Cubase’s implementation – potentially not very far, I think.

Bear in mind too that the Dorico expression map format will be a bit different to the Cubase format, but we’ll have the ability to import them (at least getting all the key switch and controller data in - we don’t need the notation data). The format of them will be quite simple, so if you already had the data in a spreadsheet then there may be easy ways of creating them from that. Maybe we can create a new subforum for people interested in this.

Hi Paul,
what do you mean by “we don’t need the notation data” ?

And, another question : how will the expression maps work in Dorico ?
In Cubase we need to have a “slot” for every possible combination of articulations. For example, if we have set “arco” at the beginning of the score and somewhere in the score we put a vibrato, a slot corresponding to arco+vibrato needs to be present in the expression map for the program to switch to the articulation.
Will it work the same way in Dorico ? Or will Dorico be able to do this combination by itself ?
This way the number of “slots” could be significantly reduced.

The reason we don’t need the notation data is that notation events in Dorico have ‘Playing Techniques’ associated with them, eg tremolo, trill, accent staccato. The playback will find the correct switch in the expression map to play it.

Dorico will be able to find the correct switch in the expression map depending on the combination of techniques at that point. You don’t need the arco (which we call ‘natural’) for all combinations, so you can just define ‘vibrato’ and it will use that if it has a higher priority than any other techniques at that point. As an example, if you have mute and vibrato at the same time then it will play using the one with the highest priority. If you have a ‘mute + vibrato’ sound available then this will be used instead. I hope that’s reasonably clear.
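Paul's priority rule could be sketched like this. The priorities and technique names are invented for illustration; this is a simplification of the behaviour he describes, not Dorico's actual algorithm:

```python
def choose_switch(active_techniques, available, priority):
    """Pick the expression-map entry for the techniques active at a point.

    Prefer an exact combination entry (e.g. 'mute + vibrato') when one
    exists; otherwise fall back to the single active technique with the
    highest priority.
    """
    combo = frozenset(active_techniques)
    if combo in available:
        return combo
    best = max(active_techniques, key=lambda t: priority.get(t, 0))
    return frozenset({best})

# A map that defines mute, vibrato, and a combined mute+vibrato sound:
full_map = {frozenset({"mute"}), frozenset({"vibrato"}),
            frozenset({"mute", "vibrato"})}
# A sparser map with no combined sound:
sparse_map = {frozenset({"mute"}), frozenset({"vibrato"})}
priorities = {"mute": 2, "vibrato": 1}
```

With the full map, mute + vibrato together select the combined sound; with the sparse map, the same input falls back to mute alone because it has the higher priority.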

Paul,

I would definitely be interested in a sub-forum for Dorico-specific Expression Map development, when the time comes. I’m going to be working with some colleagues on developing template packages of sorts for the Virtual Drumline sound library—in which there are around 150 or so unpitched drum maps that need to be created and another 70 or so pitched instruments—so having a dedicated discussion area for that would be fantastic.

OK, well I’ll take this up with Daniel.

To add to this, and at the risk of repeating myself, it would be very cool if Dorico could easily integrate with third party VSTs.

To give a simple example: let’s say I have a 1st violins staff – and I decide it works really well with East West Hollywood Strings.

Can I, from Dorico, control the following:

  1. Decide exactly which Hollywood Strings patch is loaded
  2. Decide how the patch is “played” (expression CC, pitch bend, etc.)

Thanks!!

Yes and no: the key part in the process is that you will need to create both a Playback Template and Expression Maps. (We may be able to have some way of sharing Expression Maps within the community, which would minimise the amount you need to do, but this may not be practical for things like VSL that can be set up in a totally custom manner.)

The Expression Map tells Dorico how to play arco, pizz, staccato, etc via keyswitches or controller changes. The Playback Template is effectively a blank score with all the plugins set up how you like them (including EQ/compressor/insert/Send FX settings), and your preferred sounds already loaded in each channel. You then assign each Dorico Instrument to your preferred plugin and channel, and set the expression map for it. If you wish you can also assign different techniques to a different channel, eg if you prefer a different violin tremolo.
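That instrument-to-channel assignment could be pictured as a simple routing table. The instrument names, plugin names, and the file shape are illustrative only, not an actual Dorico format:

```python
# A playback template, sketched as data: which plugin and channel each
# Dorico instrument plays through, and which expression map applies.
PLAYBACK_TEMPLATE = {
    "Violin 1": {"plugin": "VE Pro", "channel": 1,
                 "expression_map": "HS Violins"},
    "Violin 2": {"plugin": "VE Pro", "channel": 2,
                 "expression_map": "HS Violins"},
    # A single technique can be routed to a different channel, e.g. a
    # preferred tremolo patch loaded elsewhere in the rack:
    "Violin 1/tremolo": {"plugin": "VE Pro", "channel": 9,
                         "expression_map": "HS Violins FX"},
}

def route(instrument, technique=None):
    """Return the routing entry for an instrument, with an optional
    per-technique override falling back to the instrument default."""
    key = f"{instrument}/{technique}" if technique else instrument
    return PLAYBACK_TEMPLATE.get(key, PLAYBACK_TEMPLATE.get(instrument))
```

The override lookup is what makes "assign different techniques to a different channel" possible without duplicating the whole instrument entry.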

What we don’t have the ability to do is automatically load the correct patch, because this would require custom code for every different plugin. We will have this ability for Halion because it’s our own.