I would like to see the ability to select the articulation PRIOR to drawing in notes.
If I want to draw in a Choir Ooh, but Ooh is 2nd after Aah in the EM, I have to draw in an Aah, and then change it to an Ooh. Aaaah!!
And the transport resetting the articulation is also rather annoying.
P.S. 27 votes. I'm pretty sure that's the most-voted-for thing on the forum, no?
No, but it's a Top-3 feature. Just click on the "Votes" heading at the top of the Cubase forum and you'll see the ranking for the chosen category tag.
(There are some popular requests that have duplicate threads, so to have an exact ranking one would need to sum those duplicates. Anyway, this FR is among the top ones.)
I hope that Steinberg will soon change their EM system. I skimmed the thread and haven't found the suggestions below (if I missed them, my apologies):
I would love the ability to change articulations with key commands.
Unfortunately my Stream Deck is not very fond of MIDI changes (note on/off). Switching articulations with key commands (find and apply sound variations) works wonderfully in S1 (and is more reliable overall).
Another benefit: I am able to change the written articulation data with key commands as well (no need for extensive mouse action, which makes the audition part so much easier).
+1
And sorry if I missed these requests below, but I did look. Listed in order of preference:
Variable MIDI channel control INPUT, so we can switch arts from a second MIDI keyboard on another channel. And for those like me who like to use program change (so that more of the playable key range stays available), include a MIDI note-to-program-change transformer for that second MIDI control channel only (see the sketch after this list).
Along with making the list in the Inspector stretchable to show all arts, give us the ability to click them as well, so that we can change arts from the Inspector with the mouse (and record that).
To be able to grab EM event objects in the editor and truly move them around, just like we can with MIDI notes. Put them in a separate window to the right or left.
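To make the note-to-program-change idea from the first item concrete, here is a minimal sketch of an external transformer script, assuming the Python mido library with a working backend. The port names, control channel, and note-to-program mapping are placeholders for illustration; nothing here is a Cubase or Steinberg API.

```python
import mido

# All names/values below are placeholders for illustration only.
CONTROL_PORT = "Second Keyboard"          # the extra keyboard used only for switching
OUTPUT_PORT = "To Cubase"                 # a virtual/loopback port Cubase records from
CONTROL_CHANNEL = 15                      # mido channels are 0-15; 15 = MIDI channel 16
NOTE_TO_PROGRAM = {36: 0, 37: 1, 38: 2}   # e.g. C1 -> slot 1, C#1 -> slot 2, D1 -> slot 3

with mido.open_input(CONTROL_PORT) as inport, mido.open_output(OUTPUT_PORT) as outport:
    for msg in inport:
        channel = getattr(msg, "channel", None)
        if msg.type == "note_on" and channel == CONTROL_CHANNEL and msg.velocity > 0:
            # Translate the trigger note into a program change on the control channel.
            program = NOTE_TO_PROGRAM.get(msg.note)
            if program is not None:
                outport.send(mido.Message("program_change",
                                          program=program,
                                          channel=CONTROL_CHANNEL))
        else:
            # Everything else (including the playing keyboard's channel) passes through.
            outport.send(msg)
```

The point of keeping the translation on a dedicated channel is that the playing keyboard's notes are never touched, so the same trick would work regardless of how the target library lays out its keyswitches.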
Allow a negative delay "per" articulation in an expression map, which would be a really good feature, especially for orchestral patches that have different timings, so they can line up on the grid. Many composers have to use one track for each articulation because of this limitation. Having it in an expression map would be a game changer.
This video shows why composers would want negative track delay and why they are currently forced to use a separate track for each patch instead of using keyswitches or expression maps for this approach: Why You Should Quantize + Negative Track Delays Explained - YouTube
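To illustrate what per-articulation negative delay would mean in practice, here is a toy sketch in Python. The articulation names and millisecond offsets are invented examples, not values from any real library or from Cubase itself; real offsets would come from measuring each patch's attack.

```python
# Illustrative only: per-articulation negative delay applied to note start times.
NEGATIVE_DELAY_MS = {
    "staccato": -20.0,    # short samples speak almost immediately
    "legato":   -120.0,   # legato transitions need a big head start
    "tremolo":  -60.0,
}

def compensated_start(grid_start_ms: float, articulation: str) -> float:
    """Shift the MIDI note earlier so the audible attack lands on the grid."""
    return grid_start_ms + NEGATIVE_DELAY_MS.get(articulation, 0.0)

# A legato note written at 2000 ms would actually be sent at 1880 ms, while a
# staccato note on the same track would only move to 1980 ms. A single track
# delay cannot do both, which is why a per-articulation value in the
# Expression Map would replace the one-pre-delayed-track-per-patch workaround.
print(compensated_start(2000.0, "legato"))    # -> 1880.0
print(compensated_start(2000.0, "staccato"))  # -> 1980.0
```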
This is the 2nd most popular feature request. I hope Steinberg will comment on any roadmap. I heard Steinberg acknowledge back in March that it is high on their list, but it didn't make it into Cubase 12. Will we see something in Cubase 12.5/Nuendo 12.5? There have been no improvements to Expression Maps for a long, long time.
+ A zillion +1s.
What an excellent idea to have negative delays per articulation in Expression Maps - genius!
Also, I would like to add: simplify the interface of the Expression Map editor by having a "simple" and an "advanced" interface (as Kontakt instruments do).
The main reason many composers still use one track per articulation is negative track delay control (IMO). Personally, I find this just TOO much clutter.
Controlling negative track delay per articulation in expression maps would enable composers to reduce their track counts by using Expression Maps/keyswitches and capitalise more on the ideal of one instrument per staff/track, which is how it should be.
IMPORTANT, GUYS: Steinberg currently have a Customer Feedback Questionnaire. If all of you could support the OP's great suggestions re expression maps in that tool, or add your own ideas re expression maps, it may well help:
It would be great if the Virtual Instrument could be scanned to detect which patches are loaded inside, and then you could hit a button that imports that data and creates an Expression Map. Of course, one could still modify or tweak that Expression Map, but it would go a long way towards helping create one. Another thing to note is how Studio One works with VSL, where it just knows what is loaded in the Synchron Player and the Expression Map is already set up, ready to go.
If there could be a little check mark next to each map signalling whether it is used in the session, that would be fantastic. It's nearly impossible to clear old maps out of a big template without this.
Hear, hear re negative delay compensation. At the moment keyswitches are unusable for pro sounds, as the negative track delay is set up per track, not per articulation, meaning timings are out globally.
The direction option does not work in the drop-down list in the info line: anything marked "direction" does not show up. I, and most people I believe, mark things as attributes. If I need a group of notes marked staccato, it's easy to just lasso them all. No need for directions at all, IMO.
We could also be shielded from a lot of the needless complexity with a basic mode and then an advanced mode.
The way that naming works in the various fields of the Expression Map editor is just plain weird.