I've been using a backup routine on my Linux home server where I connect an external USB drive every other month and run a backup script that archives the relevant folders into a compressed tarball named backup_<date>.tar.gz. Over time this backup has grown to 1.3 TB and now takes about 3 days to complete. Is a tar.gz file this large reasonable for a single full backup, or should I switch to an incremental backup method instead? I'd appreciate any guidance on improving my backup process.
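For context, the core of the script is essentially this (mount point and folder list simplified):

    #!/bin/sh
    # Simplified version of my backup script; paths shown are placeholders.
    tar -czf "/mnt/usb/backup_$(date +%F).tar.gz" /home /etc /srv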
1 Answer
First off, one backup every couple of months is risky: if something goes wrong just before the next run, you could lose up to two months' worth of work. Instead, have a look at tools like rsnapshot, which keeps a series of snapshots and uses hardlinks for files that haven't changed, so each snapshot looks like a complete backup while only changed files consume new space. Recovery is also much simpler, because you can browse every snapshot as an ordinary directory tree.
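If you're curious how the hardlink trick works, here is a minimal sketch using plain rsync with --link-dest, which is the mechanism rsnapshot builds on. All paths and the source list are illustrative, not a drop-in script:

    #!/bin/sh
    # Minimal hardlink-snapshot sketch using rsync --link-dest.
    # Paths here are illustrative -- adjust to your layout.
    SRC="/home /etc"               # folders to back up (unquoted below on purpose)
    DEST="/mnt/usb-backup"         # mounted external drive
    NEW="$DEST/$(date +%F)"        # today's snapshot directory

    # Most recent previous snapshot, if any (empty on the first run).
    LAST=$(ls -1d "$DEST"/????-??-?? 2>/dev/null | tail -n 1)

    # Unchanged files become hardlinks into $LAST, so each snapshot
    # appears complete but only changed files take up new space.
    # On the first run LAST is empty and --link-dest is simply omitted.
    rsync -a --delete ${LAST:+--link-dest="$LAST"} $SRC "$NEW"

Because each run only transfers and stores what actually changed, it should take minutes to hours rather than days, which makes a nightly or weekly schedule realistic even with 1.3 TB of source data.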
Is rsnapshot still actively maintained? I had issues with it in the past because recoveries felt complicated.