A lot of the simpler correspondences between notation and playback have been dealt with, but I’d guess that work is being done on the expansion and contraction of repeats, bar repeats, and tremolos, and also on the interpretation of ornaments and pauses.
A really well implemented conductor track would take Dorico to a new level of expressivity.
Hopefully a lot of the mainstream notation requests will finally be met…
As Dorico 4 largely set Expression Maps aside to focus on other priorities (or perhaps it was Paul being assigned to other duties), I would also expect new developments in this direction. Dynamics-dependent patch changes were already flagged at the time of D3.5 as being in the plans for future automation development, if I remember correctly, and it would certainly make sense to be able to reprogramme, and perhaps increase the number of, the five current note-length boundaries.
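The idea of length boundaries and dynamics-dependent patch changes can be sketched as a simple lookup: classify each note by duration, then pick a patch from the length class and the written dynamic. This is a minimal illustration only; none of these names or thresholds come from Dorico’s actual expression map implementation.

```python
# Hypothetical sketch of length- and dynamics-based patch switching.
# Thresholds and patch names are invented for illustration; Dorico
# currently distinguishes five note-length classes.
LENGTH_BOUNDARIES = [
    (0.125, "very short"),
    (0.25, "short"),
    (0.5, "medium"),
    (1.0, "long"),
]

def length_class(duration_s: float) -> str:
    """Map a note duration in seconds to a length class."""
    for boundary, name in LENGTH_BOUNDARIES:
        if duration_s < boundary:
            return name
    return "very long"  # longer than every boundary

def choose_patch(duration_s: float, dynamic: str) -> str:
    """Pick a patch from the length class plus the written dynamic
    (the 'dynamics-dependent patch change' idea)."""
    cls = length_class(duration_s)
    loud = dynamic in ("f", "ff")
    if cls in ("very short", "short"):
        return "staccato_ff" if loud else "staccato_p"
    return "sustain_ff" if loud else "sustain_p"

print(choose_patch(0.1, "ff"))  # short note, loud -> staccato_ff
print(choose_patch(2.0, "p"))   # long note, quiet -> sustain_p
```

Being able to reprogramme the boundary values would amount to letting the user edit that threshold table per library.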
Ability to take audio device inputs into the mixing console (External Instruments).
Ability to make/remove AUX send channels at will (like the one that currently hosts a reverb by default).
More options in expression maps for implementing legato interpretation for various instrument libraries.
Lots and lots of tweaks to optimize overall performance and fix odd bugs.
New licensing system gets its kinks worked out, improving load times and efficiency. (The eLicenser version opens instantly for me. Under the new system, the UI of anything HALion/Groove Agent takes three minutes or more to open, no matter the host involved, be it a Steinberg one, a standalone version, or a Plogue engine. Opening HALion/Groove Agent plugins that use the new licensing system takes FOREVER on my rig. Why? I rolled back to the dongle version and it’s snappy again.)
More options for group/educational licensing and mass deployment.
Lots of complaints when Apple decides to break something (maybe Rosetta, maybe something else) that forces a lot of people to rework significant portions of their older projects (those involving VST2, for example).
Workflow and UI improvements (particularly in Play and Engrave areas).
Continued improvements on things like importing/exporting MIDI and XML.
“People complaining” is certainly an accurate prediction!
Yes, Rosetta will be removed at some point, though developers should already be providing Apple Silicon versions, more than two years in. Luckily, most are doing well on that front: Dorico, NotePerformer, Kontakt, Opus, Sine Player – even ARIA Player! – are all ARM-native.
Apple learnt a lot from the PPC–Intel transition. Even after six years, some developers (QuickBooks, FontLab) acted as if it was a complete surprise when Apple pulled the plug. Hence the warnings to users in High Sierra and Mojave that 32-bit software needed updating (“refer to the developer!”) – before the plug was finally pulled in Catalina. (I usually try to rid myself of the old architecture as soon as possible, to avoid being left without a chair when the music stops.)
Speaking of complaints, some people still lambast Dorico for not yet including “feature X” after all this time – I have no doubt that Dorico 5 will answer many of those gripes, and contain more features that we didn’t know we needed, like Condensing, Flow Headings, etc.
The basic idea is that one would play to indicate the pulse (it could be nothing more than taps on a keyboard) and the tempo would be extracted and mapped to the bars and beats. Taking it further, one could play a section of the score (perhaps a bar or two with a colla voce, or a rit. and an a tempo, maybe a guitar solo or anything else with rubato) and Dorico would map the played rhythm to the notation. One would be able to edit the mapping manually. The ‘conducting’ could conceivably be extracted from a recording, but there’d essentially be no difference.
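The core of the tap-to-tempo step is just measuring the intervals between taps and converting each one to a BPM value that can be attached to successive beats. A minimal sketch, assuming taps arrive as timestamps in seconds (this is an illustration, not how Dorico or any DAW actually implements it):

```python
# Minimal sketch of tap-tempo extraction: given tap timestamps in
# seconds, derive one BPM value per inter-tap interval, which could
# then be mapped onto successive beats of the score.

def taps_to_bpm(tap_times):
    """Return a BPM value for each interval between consecutive taps."""
    bpms = []
    for earlier, later in zip(tap_times, tap_times[1:]):
        interval = later - earlier      # seconds per beat
        bpms.append(60.0 / interval)    # beats per minute
    return bpms

# A slight ritardando: intervals widen from 0.5 s to 0.6 s per beat.
taps = [0.0, 0.5, 1.02, 1.58, 2.18]
print([round(b, 1) for b in taps_to_bpm(taps)])
# → [120.0, 115.4, 107.1, 100.0]
```

Mapping a freely played passage back to notated rhythms is the harder half of the feature, since it requires aligning the performance to the score rather than just differencing timestamps.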
Got it. There’s already a Tap Tempo button in the tempo side panel. Anyhow, FWIW, I find tempo indications and rubato fluctuations a small part of humanizing; getting CC1/CC10 dynamics/intensity/quality right is the bigger job – balancing, basically.
Agreed, the ability to humanize tempo changes in real time would be absolutely massive. It makes it possible to create tempo changes based on listening and responding to the music in real time, as opposed to drawing a line on the screen and then listening back to see if you got it right.
I could also derail this thread by complaining about the robotic-ization of music through rhythmic quantization (drum pads, loops, metronomes), but I won’t!