Servers: A few questions

I’m thinking about getting a server for my system. I have recently been having one nightmare scenario after another with failing drives. I have a bunch of 3TB Seagate Barracuda drives that I bought in bulk about 5 years ago(??). Two months ago I lost the one that was my video backup drive. Yesterday, I found that my audio drive was now failing (offloading like crazy to other drives to save files!!). The tech told me that Seagate wasn’t a particularly good brand for the kind of work I was doing. He said I should be using Western Digital drives as my bottom-line choice for DAW/video/media work. I’d never heard this before. I’ve used both Seagate and WD drives for over 20 years with essentially no problems, seeing absolutely NO DIFFERENCE between the two (or any other brand, for that matter). So this came as a big surprise to me.

When I talked to the techs at Drivesavers regarding the currently failing drive, they said that they weren’t aware of a “quality issue” with Seagate or any other drives, regardless of type (HDD, SSD, Compact Flash cards/drives, etc.). They recommended the old tried-and-true “back up to 3 sources” method as the safest approach.

So while I was considering replacing all of my HD drives with SSD drives, I started thinking about getting a server with some kind of automated backup capability. I thought I had a pretty good system backing up to multiple drives until those same multiple drives started failing (this is the second in 3 months!). Now, with everything (apps, OS, licenses, etc.) updating every couple of days/weeks/months, it’s getting overwhelming trying to keep track of all of this. It’s just me. It’s hard enough to stay on top of the craft & gear, plus making deadlines for actual work!

I’m not comfortable with cloud resources because of theft/leaking issues. But maybe if I could set up a server to feed cloned PCs and automatically back up everything nightly, I could actually get back to something resembling workflow.

Any suggestions on models, features, etc.? Is this overkill? Is there a simpler way to manage all of this? What do you guys think?

It really depends on (a) how much you’re prepared to spend and (b) the total amount of storage you want secured. I’m going to assume you have a very large volume of data, as that’s the only reason to use mechanical drives and not SSDs exclusively these days.

In a nutshell, don’t use single drives, build RAID arrays. You don’t necessarily need a server; in fact many high-end motherboards these days have a RAID controller built in. The problem is, it’s not trivial to configure and maintain if you’re not technically-minded, and it needs to be done before you install anything. That, combined with a separate NAS box with its own RAID array(s) and an automated backup regime, would be reasonably safe.
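To see why a RAID array survives a dead drive, here’s a minimal Python sketch of the XOR parity idea behind RAID 5. It’s purely illustrative; a real controller stripes data and rotates parity across the drives, but the arithmetic is the same:

```python
# Minimal sketch of the XOR parity idea behind RAID 5 (illustrative only;
# a real controller stripes data and rotates parity across the drives).

def make_parity(blocks: list[bytes]) -> bytes:
    """XOR all data blocks together to produce the parity block."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

def rebuild_lost_block(survivors: list[bytes], parity: bytes) -> bytes:
    """Any single lost block is the XOR of the parity and the survivors."""
    return make_parity(survivors + [parity])

# Three "drives" worth of data, plus the parity computed from them:
d1, d2, d3 = b"audio-01", b"video-02", b"mixes-03"
parity = make_parity([d1, d2, d3])

# Drive 2 fails; its contents come back from the rest of the array:
recovered = rebuild_lost_block([d1, d3], parity)
assert recovered == d2
```

Lose two drives at once, though, and the math no longer has enough information to rebuild, which is why RAID complements backups rather than replacing them.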

With your previous problems I’m almost inclined to ask if you have electrical problems of some kind in your facility… anyway…

Sorry, I know you said it made you uncomfortable, but I actually think it’s pretty safe. All of the major companies make their money off of storing people’s data. If there was a data breach it would ruin their reputation and thus their business. I’m not sure how much money you or your company makes, but I’m guessing that companies like Backblaze have clients far bigger than you. So if you had a problem with content ‘leaking’ from their servers then their other clients would be worried too. So really I think there’s a huge incentive for those companies to maintain stellar security protocols.

On top of that you’d have to wonder who would go through the trouble to steal your data. No offense, but I don’t really see why anyone would care in the first place.

Now, I’ve just worked on two series that required me to sign NDAs. But in both of those cases the actual content I’m working on looks virtually indecipherable. “THC_PAWN_231515_CRS_000_2398_60_20200416_03.mov”, for example. We know it’s a movie file, but for what? So anyone wanting the data I’ve stored in the cloud would, if they could even read that string in the first place, have to either decipher it to see if it’s worth taking or watch the movie off of the Backblaze servers.

But Backblaze actually has built-in encryption. Not only that but (supposedly) the encryption happens on my end before the transfer to their servers. So any potential thief would have to access their servers, then crack the encryption, then make sense of the data they’d be sitting on. I would be of zero interest to them despite those NDA shows.
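If you want a feel for what “encryption happens on my end” means in practice, here’s a rough Python sketch of the client-side idea using the third-party cryptography package. To be clear, this is a generic illustration and not Backblaze’s actual scheme, and upload_to_cloud is a made-up placeholder:

```python
# Client-side ("encrypt before upload") backup in miniature. This is NOT
# Backblaze's actual scheme, just the general idea, using the third-party
# 'cryptography' package (pip install cryptography).

from cryptography.fernet import Fernet

# The key lives only on your machine; the provider never sees it.
key = Fernet.generate_key()
cipher = Fernet(key)

with open("THC_PAWN_231515_CRS_000_2398_60_20200416_03.mov", "rb") as f:
    plaintext = f.read()

# Only the ciphertext ever leaves your machine, so a thief would need
# both the stolen blob from the server AND your local key.
ciphertext = cipher.encrypt(plaintext)
upload_to_cloud(ciphertext)  # hypothetical placeholder for the transfer

# Restoring reverses it locally:
assert cipher.decrypt(ciphertext) == plaintext
```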

So anyway, I really do think that cloud backups in 2020 are a legit way to go. People in our industry have been doing it for a while and it’s an easy and convenient way to backup data off-site.

I set up a server for a studio at one point, but it was almost two decades ago (cloud backup wasn’t feasible at the time). It was a dual-CPU server build with a RAID 5 array in removable hard drive trays. It also had a DDS4 tape drive and a DVD-R burner, and we ran Retrospect for the backup software. The way I set it up was:

  • as a file server of sound effects and a music library for the composer (“near-line”), where all of that data existed on other media as well,
  • as a backup system where backups executed automatically onto the RAID array, which being RAID 5 was redundant in itself (I swapped drives a couple of times and the array rebuilt itself). The array ran off a RAID card (I forget the brand),
  • and as an archival point where everything to be stored long-term ended up on DDS4 tape and on DVD-R.

It was a great setup for the money, managed by me, and while it took some learning it wasn’t too bad at all. What ended up being a pain was networking with Apple computers (I hate Apple). Anyway, I do like the solution and it was easy enough, but that’s easy for me to say and people are of course different.


What I would say though is that I’m not so sure I’d get an actual server again. I think there’s a case to be made for using network-attached storage, for example, and simply triggering backups from software on your individual workstation(s). This way you have the data outside of your computer, so you could “easily” restore it to another computer without having to open up your case and screw around with drives.

Right now I’m using Macrium Reflect and it has a bunch of options for how to create your backup sets. You can set the software to shut down automatically after a backup set has executed, and I believe you can create a script/icon and drop it on your desktop. So at the end of the workday you can double-click on that icon instead of shutting down your computer and it’ll back up and then shut down. Of course you can also schedule backups using it.
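For what it’s worth, the “back up, then shut down” trick doesn’t have to depend on any one product. Here’s a rough Python sketch of such an end-of-day script; the backup command line is a placeholder for whatever your tool provides (not Macrium’s actual CLI), while shutdown /s /t is the standard Windows shutdown command:

```python
# "End of day" script: run the backup, then power off (Windows).
# BACKUP_CMD is a placeholder: substitute the command line your backup
# tool actually provides; this is not Macrium's real CLI.

import subprocess
import sys

BACKUP_CMD = ["your_backup_tool.exe", "--run-set", "nightly"]  # hypothetical

result = subprocess.run(BACKUP_CMD)
if result.returncode != 0:
    # Don't power off after a failed backup; leave the machine up so you
    # can see what went wrong.
    sys.exit(f"backup failed with code {result.returncode}; not shutting down")

# /s = shut down, /t 60 = 60-second grace period (cancel with "shutdown /a").
subprocess.run(["shutdown", "/s", "/t", "60"])
```

Drop a shortcut to that on the desktop and double-clicking it at the end of the day does the whole routine.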

I have an internal drive that I’m decommissioning in favor of a 6TB drive. The 6TB drive is currently external in a dock, and perhaps I’ll move it into the case. But I’m also doing Backblaze for off-site backups. The first thing to consider with something like Backblaze is that the initial backup takes a long-ass time. Like days. After that it’s fine. The second thing is that you have to understand their data retention/deletion rules. For Backblaze I think they keep data for 30 days after you delete it off of your system. This means that if you have a file on your work drive on January 1st, back up, and then delete it on January 2nd, it will be deleted off of their servers 30 days later. So it’s literally a backup of whatever files/drives/system you point it at. Take something off your system and it doesn’t get kept indefinitely.
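If it helps, the retention math in that example looks like this (a tiny sketch assuming the 30-day window I described; check the provider’s current policy for the real number):

```python
# Backblaze-style retention arithmetic, assuming the 30-day window
# described above (check the provider's current policy for real numbers).

from datetime import date, timedelta

RETENTION = timedelta(days=30)

backed_up = date(2020, 1, 1)        # file exists locally and gets backed up
deleted_locally = date(2020, 1, 2)  # you remove it from your system

purged_from_server = deleted_locally + RETENTION
print(purged_from_server)  # 2020-02-01: after this, the copy is gone for good
```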

I would also offer an argument against building a RAID array in your computer and running it off of the motherboard’s controller (or in software). My reasoning is that you need that same controller/software to access the data, so if the motherboard dies I’d be worried about getting the array back. With something like Macrium and other backup software I believe access is easier. I could be wrong of course.


So I’d probably recommend a network-attached RAID storage box and software that backs up to it, plus cloud backup. You could do direct-attached storage as well of course, which may speed up data transfers if that’s an issue. If you want to keep the drive internal there’s always the option of getting a tray for easy removal.

My setup is:

  • A “server” (=my previous workstation) with a RAID set of disks.
  • A local USB3 connected external drive dedicated to Macrium backups of the whole system drive (I only have one drive on my workstation).
  • Windows File History set to back up to the RAID on my server.

If you don’t have any old hardware that can take the role of server, a NAS with built-in RAID is a good solution. RAID in any configuration is a must if you want to survive single disk failures.

Windows File History is an often-forgotten feature that is actually pretty cool. I have configured File History to make backups of all important parts of my local drive (to the server RAID) every hour (i.e. if I make a change to a local file, that version will be backed up within an hour). And the cool thing is that all versions are kept on the RAID disks, so not only can I restore the latest version in case of a local disk problem, I can restore the state (for example a Cubase project folder) from any point in the past.
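If you’re curious what that versioning amounts to, here’s a miniature of the idea in Python: stash a timestamped copy of each changed file instead of overwriting a single backup. This is just a concept sketch with made-up paths, not how File History actually works internally:

```python
# Miniature of the File History idea: each run stashes a timestamped copy
# of every changed file on the backup target instead of overwriting one
# backup. A concept sketch with made-up paths, not File History internals.

import hashlib
import shutil
import time
from pathlib import Path

SOURCE = Path("C:/Projects/MySong")          # e.g. a Cubase project folder
TARGET = Path("//server/raid/FileHistory")   # the RAID share on the server

def backup_changed_files() -> None:
    for src in SOURCE.rglob("*"):
        if not src.is_file():
            continue
        digest = hashlib.sha256(src.read_bytes()).hexdigest()[:12]
        stamp = time.strftime("%Y-%m-%d_%H%M%S")
        dest_dir = TARGET / src.relative_to(SOURCE).parent
        # Skip content we already have a copy of (same hash = unchanged).
        if any(dest_dir.glob(f"{src.stem}_*_{digest}{src.suffix}")):
            continue
        dest_dir.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest_dir / f"{src.stem}_{stamp}_{digest}{src.suffix}")

# Run this hourly (Task Scheduler or similar) and every version stays
# restorable, newest or from any point in the past.
backup_changed_files()
```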

a dedicated server for storage and backup is a waste of money in 2020
the CPU will get bored very soon :wink:
use a NAS system instead