Dorico 4 - blah! - Why these massive files?

You said you are using an SSD RAID system, so you will get a read rate of somewhere between 600 MB/s and 1.2 GB/s… what was your file size? Reading the file in takes a fraction of a second in your case; the real question is what is running in the background of your system. It is not just the apps that are visibly open: Windows runs tons of stuff in the background by default.
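To put those throughput figures in perspective, here is a back-of-envelope calculation of ideal sequential read times. The file sizes and speeds are illustrative assumptions, not measurements of any particular machine, and OS overhead and latency are ignored:

```python
# Ideal sequential read time for a file at typical SSD-RAID throughput.
# Sizes and speeds below are illustrative assumptions, not measurements.

def read_time_seconds(file_bytes: float, throughput_bytes_per_sec: float) -> float:
    """Ideal read time, ignoring OS overhead, latency and caching."""
    return file_bytes / throughput_bytes_per_sec

MB = 1_000_000
GB = 1_000_000_000

# A few-megabyte project file on a 600 MB/s array reads in milliseconds:
print(f"{read_time_seconds(5 * MB, 600 * MB) * 1000:.1f} ms")  # ~8.3 ms

# Even a 15 GB file at 1.2 GB/s is done in seconds:
print(f"{read_time_seconds(15 * GB, 1.2 * GB):.1f} s")  # ~12.5 s
```

So with that hardware, raw disk I/O on a megabyte-scale file is nowhere near the bottleneck.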

Not sure why a file of a few megabytes is a problem in your eyes. In my daily job I work with files larger than 10-15 GB, and even that is considered a standard file size (real files run to hundreds of gigabytes).

Regarding plugged in versus not plugged in: check the power management settings of your machine and see whether it is even configured to run at full speed. In "not plugged in" mode at least, it will reduce speed dramatically if you haven't changed the defaults.

Yep. Let’s move on.


What annual payments?

The OP probably means regular upgrades.

I think it’s pretty clear by now that the file size by itself is not the issue. I can load several audio files simultaneously and play them back almost immediately.

Actually I am quite surprised that no one in this thread has mentioned that achieving the OP’s original proposal (a smaller file size) would probably lead to even longer loading and saving times.
It all comes down to compression. The question is: if my file is 10 MB on the hard disk, do I see 10 MB of information once it’s opened?
The answer is clearly “no”.

To give a more specific example: Dorico could either save the exact position of each element in its final visual form (= big file size), or it could save the information in a more abstract form and recalculate what the layout should look like every time the file is opened (= small file size). The latter obviously requires calculations before the file can be properly displayed, and even worse: calculations that were already done the last time the file was open.
A similar thing goes for saving: the displayed information first needs to be compressed, which requires CPU power, before it can be stored.
This is of course true for any kind of data and any compression algorithm/type.
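The abstract-versus-resolved trade-off described above can be sketched in a toy example. All the names and structures here are hypothetical, not Dorico’s actual data model; the point is only that the resolved form serialises larger but needs no recomputation at load time:

```python
# Toy illustration: storing a resolved layout (explicit coordinates, larger)
# versus an abstract description that must be recomputed on every load
# (smaller, but costs CPU at open time). Names here are hypothetical.

import json

# Abstract form: just the musical content plus a spacing rule.
abstract = {"notes": ["C4", "E4", "G4"], "spacing_rule": "proportional"}

def resolve_layout(doc: dict) -> dict:
    """Recompute explicit x-positions from the abstract spacing rule."""
    step = 40  # hypothetical engraving unit
    return {
        "notes": [{"pitch": p, "x": i * step} for i, p in enumerate(doc["notes"])]
    }

resolved = resolve_layout(abstract)

# The resolved form takes more bytes on disk, but opens without any layout work.
print(len(json.dumps(abstract)), len(json.dumps(resolved)))
```

With real scores the resolved data would be far larger again, and the recomputation far more expensive, which is exactly the trade-off at stake.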

This is why audio folks record in .wav instead of .flac (even though FLAC compression is lossless!), and why video folks work with ProRes files instead of H.264-encoded files (in that case the compression wouldn’t be lossless anyway, but computational power is a significant factor too!).
These examples clearly show that if you want time-efficient processes, what you actually aim for is the least compressed version of the data available/suitable for you ⇒ bigger file sizes are actually faster.
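You can observe this compression/CPU trade-off directly on your own machine. The snippet below uses zlib purely as a stand-in (Dorico’s own container format may behave differently): higher compression levels produce smaller output but burn more time doing it:

```python
# Demonstrate the size-versus-time trade-off of compression levels.
# zlib is a stand-in here; other formats show the same pattern.

import time
import zlib

data = b"some repetitive score data " * 200_000  # ~5.4 MB of sample bytes

for level in (1, 6, 9):
    start = time.perf_counter()
    packed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(packed):>8} bytes in {elapsed * 1000:.1f} ms")
```

The exact numbers depend on the machine and the data, but the shape of the result is the same everywhere: squeezing the file harder costs CPU time on every save and load.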


Dorico’s file format is already a zipped archive of several data files.

My point exactly. Further compression would just lead to even longer loading and saving times.
