What’s the best way to back up 60TB of legacy data in manageable chunks?

Asked by LurkingGiraffe92

Hi everyone! I'm facing a challenge with backing up over 60 terabytes of data accessible through Windows file sharing. I need to preserve this data permanently, but I don't have a single location large enough to hold it all. Cloud storage isn't an option, so I'm looking for software that can package the data into manageable .iso or .dmg files without splitting individual files or directories across chunks. Ideally, I'd like to be able to pause the process and resume it later as I swap out hard drives. I remember that Retrospect on the Mac could once handle this, and I'm wondering if something like Veeam Community Edition, ddrescue, or macOS Disk Utility could help. Any thoughts would be greatly appreciated!

1 Answer

Answered by TechieTurtle07

One option is to check whether the original storage has enough free space to hold a compressed copy of the data. You could archive the files into chunks sized to fit on the drives you're using for backup. Keep in mind, though, that this gives you a snapshot of the data as it exists right now; anything added later won't be captured. It may also be easier to work through the data in parts, one directory at a time, rather than trying to back everything up in a single pass.
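To make the "one directory at a time" idea concrete, here's a minimal sketch that creates one compressed tar archive per top-level directory, so no file or directory is ever split across chunks. The paths (`/mnt/share`, `/mnt/backup_drive`) are hypothetical placeholders for the mounted Windows share and whichever external drive is currently attached:

```shell
#!/bin/sh
# Sketch: one compressed archive per top-level directory of the share.
# Paths below are examples -- adjust to your own mount points.
SRC=/mnt/share          # the mounted Windows share
DST=/mnt/backup_drive   # whichever external drive is plugged in right now

for dir in "$SRC"/*/; do
    name=$(basename "$dir")
    # Skip directories that were already archived on an earlier run.
    # This is what makes the job resumable after swapping drives:
    # plug in the next drive and simply re-run the script.
    [ -e "$DST/$name.tar.gz" ] && continue
    tar -czf "$DST/$name.tar.gz" -C "$SRC" "$name"
done
```

If a single top-level directory is bigger than one drive, you'd need to descend a level deeper before archiving, but the skip-if-present pattern stays the same.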

DataDynamo23 -

This is a one-time thing for failed data, so I just want a read-only, exact duplicate. Once I’m done, I can power it off and call it good.
