Is My Backup Method Too Big? Looking for Advice on tar.gz Usage

Asked By CuriousCoder92

I've been using a backup routine on my Linux home server where I connect an external USB drive every other month. My backup script archives relevant folders into a compressed tarball named backup_date.tar.gz. However, over time, this backup has grown to a whopping 1.3 TB and takes about 3 days to complete. I'm wondering: Is this tar.gz file too large for a single backup? Should I consider switching to an incremental backup method instead? I'd appreciate any guidance or suggestions for improving my backup process.
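For context, a full-archive script of the kind described usually boils down to a single tar invocation along these lines; the folder list and USB mount point here are guesses, not the asker's actual paths:

    #!/usr/bin/env bash
    # Full backup: every file is re-read and re-compressed on each run,
    # whether or not it changed since the last backup.
    # The source folders and mount point below are assumptions.
    tar -czf "/mnt/usb/backup_$(date +%F).tar.gz" /home /etc /srv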

1 Answer

Answered By BackupNinja77

First off, one backup every couple of months is risky: if the drive or the data fails, you could lose up to two months' worth of work. A 1.3 TB tar.gz isn't broken as such, but re-reading and re-compressing everything on every run is exactly why the job takes days. Have a look at tools like rsnapshot instead: it uses hard links for files that haven't changed, so each snapshot only stores new or modified data, and recovery is a breeze because every snapshot can be browsed like a full copy of your backup history.
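To make the hard-link idea concrete, here is a minimal sketch of the technique rsnapshot automates, written as a plain rsync script; every path and folder name below is a placeholder, and in practice you would install rsnapshot and set the equivalent options in its config file rather than hand-roll this:

    #!/usr/bin/env bash
    # Hardlink-based snapshots with rsync --link-dest, the same idea
    # rsnapshot automates. All paths below are placeholders.
    set -e

    SRC=(/home /etc /srv)          # folders to back up (assumed)
    DEST="/mnt/usb/snapshots"      # external drive mount point (assumed)
    STAMP="$(date +%F)"
    LATEST="$DEST/latest"

    mkdir -p "$DEST/$STAMP"

    if [ -d "$LATEST" ]; then
        # Files unchanged since the last snapshot are hardlinked rather
        # than copied, so each run only stores new or modified data, yet
        # every snapshot directory can be browsed and restored from like
        # a full copy.
        rsync -aH --link-dest="$LATEST" "${SRC[@]}" "$DEST/$STAMP/"
    else
        # First run: nothing to link against, so this is a full copy.
        rsync -aH "${SRC[@]}" "$DEST/$STAMP/"
    fi

    # Point "latest" at the snapshot we just finished.
    ln -sfn "$DEST/$STAMP" "$LATEST"

Run something like this daily or weekly from cron. Pruning old snapshots is just deleting the oldest dated directory, which is essentially what rsnapshot's retention settings do for you.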

DataDude88 -

Is rsnapshot still actively maintained? I ran into trouble with it in the past because recoveries felt more complicated than I expected.
