Friends: at your request we’ve loaded up a re-watch of last week’s episode of chasejarvisLIVE to my YouTube channel [subscribe here] for your review and perusal. If you’re at all interested in our digital photo & video workflow and backup, then this is worth your time. It’s a follow-up to our popular workflow video and post where we discussed and reviewed our entire digital strategy, from capture through to delivery of final files to the client…even how we back up our daily work, our email preferences, and our music. You name it. The gear, the plans, the whole mess.
I’ve said it before, but it would be remiss not to mention it again: I think this topic is one of the most important fundamentals–not just for professional photographers and filmmakers like us–but for anyone with valuable digital content that’s worth backing up. In this episode, Scott, Dartanyon, Erik and yours truly took live questions from a worldwide Twitter audience and addressed in depth the earlier questions we received on our recent post–everything from the basics of our strategy to the subtle, scalable parts that make this worthwhile for almost anyone.
We’ll keep an eye on the comment section below and do our best to answer any other topics/questions we may have missed. Surely there’s plenty.

Hi guys, PLEASE PLEASE PLEASE help, as my system was designed from scratch and seems almost identical to yours, except we use CalDigit RAID 5 arrays instead of the rack-mounted G-Tech unit. What I want to know is: if you never delete footage (which is also our approach), how do you keep a synced off-site copy of what’s on that G-Tech RAID / server? We have been relying on the RAID 5, like you, but what if it’s stolen or burnt to a crisp? I know you said you have an off-site drive – an 8TB G-Tech or something. But 64TB is a lot more than 8TB, so how do you split that up and maintain a synced off-site copy?
It’s the last piece in my puzzle and I’d suuuure like to find it!
By the way… big fan… big!
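On splitting a big archive across smaller off-site drives: one common approach (not something Chase describes, just a sketch of the general technique) is to treat whole project folders as units and pack them onto drives first-fit-decreasing by size. The function names below are hypothetical, and it assumes no single project exceeds one drive:

```python
import os

def folder_size(path):
    """Total size in bytes of all files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            fp = os.path.join(root, name)
            if os.path.isfile(fp):
                total += os.path.getsize(fp)
    return total

def assign_to_drives(folder_sizes, drive_capacity):
    """First-fit-decreasing bin packing: place each project folder on the
    first off-site drive that still has room, largest folders first.
    folder_sizes maps folder name -> size in bytes.
    Returns a list of drives, each a list of (folder, size) tuples."""
    drives = []  # each entry: [remaining_bytes, [(folder, size), ...]]
    for folder, size in sorted(folder_sizes.items(), key=lambda kv: -kv[1]):
        if size > drive_capacity:
            raise ValueError(f"{folder} ({size} bytes) won't fit on one drive")
        for drive in drives:
            if drive[0] >= size:
                drive[0] -= size
                drive[1].append((folder, size))
                break
        else:
            drives.append([drive_capacity - size, [(folder, size)]])
    return [d[1] for d in drives]
```

With, say, an 8TB capacity per drive, the output is a packing list telling you which project folders go on which off-site drive; after the first full copy, only new projects need to be added.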
Hi Chase and team,
Some specific questions I still haven’t found an answer for:
1. I remember in the first workflow and backup post on the blog you said never to download or back up through any piece of software, but instead to use plain OS functionality (copy and paste) for safety reasons (to keep a clean copy of the original RAW file, since software can be buggy). It looks like you do it differently now, if I understand it right (downloading and backing up through Aperture). Why?
2. How do you verify the downloads from the CF cards? Any tips on how to avoid surprises later (corrupt files) if you have a big shoot with 4000+ images every day and maybe no time to inspect every image in LR/Aperture before reformatting the cards? I recently had corrupt RAW files caused by a bad card reader. Re-downloading the cards with another card reader solved the problem, but that was only possible because I hadn’t reformatted the CF cards yet. I usually check the file count and total byte size after a download, but this did not reveal the corrupt images created by the bad card reader. Scary experience. How can I avoid that in the future?
3. How do you verify data integrity when you move files around (e.g. from location drives to the server in the studio and later to the offsite drives) and make sure nothing got corrupted on the way there? Do you use any software to check? I can’t imagine you use a bit-by-bit verification, as this would take a very long time for the amount of data you produce.
Thanks for clarifying!
Chase,
What do you think of Final Cut Server? I know you use Aperture for cataloging.
Did you consider Final Cut Server over Aperture for the backend?
Thanks,
Juan
Good info, thanks!
Hi Chase!
Do you have only one big Aperture library on your server, or an Aperture library for each project?
And on the part of the server where you store your RAW files, are those files inside a library or just in folders?