What are the best strategies for backing up and restoring a self-hosted Docker PostgreSQL database?

Asked By Curious_Cat94 On

I'm launching a SaaS project as a solopreneur, and I need a reliable way to back up my customer data without service interruptions. I'm planning to host my backend, likely Django, on a VPS. Should I maintain a second instance for automatic replication, or would it be sufficient to set up cron jobs that create tarballs of the data files and transfer them via SFTP? Additionally, if I go the replication route, how can I configure a proxy to switch traffic to the secondary server if the primary fails? I'm also considering Supabase, but I'm weighing its handling of a high volume of writes against the flexibility I may need for my business model. Any advice would be appreciated!

3 Answers

Answered By SafeGuardGuru On

Definitely consider the legal aspect too. Have you consulted a lawyer to draft a contract that limits your liability in case of a failure? It's important to have that covered even if you think it won't happen.

Answered By TechieTribe On

If you're running everything on AWS or Azure, a great strategy is to back up nightly using a Django Celery task, dumping into S3 or blob storage. For a self-hosted solution, you could use MinIO or similar. Just set a policy to clean up backups after a specific number of days to keep things manageable. It keeps your tooling flexible!

DataDefender101 -

That sounds like a solid plan! Are you using a specific command to create those dumps, like `python manage.py dumpdata myapp > myapp_data.json` or something you’ve automated?

Answered By PGWhiz On

I've been using pgBackRest for my PostgreSQL backups, and it's been really reliable. Just recently I upgraded from Postgres 16 to 17, and the restore went through without any issues.
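For anyone new to pgBackRest, a minimal single-host setup is roughly the sketch below. The stanza name `main`, the repository path, and the data directory are assumptions to adapt to your install; retention is handled for you via `repo1-retention-full`.

```shell
# /etc/pgbackrest/pgbackrest.conf (sketch):
#
#   [global]
#   repo1-path=/var/lib/pgbackrest
#   repo1-retention-full=2        # keep the two most recent full backups
#
#   [main]
#   pg1-path=/var/lib/postgresql/17/main
#
# postgresql.conf must also ship WAL to the repo:
#   archive_mode = on
#   archive_command = 'pgbackrest --stanza=main archive-push %p'

pgbackrest --stanza=main stanza-create    # initialize the repository
pgbackrest --stanza=main check            # verify archiving is wired up
pgbackrest --stanza=main --type=full backup
pgbackrest --stanza=main restore          # run against an empty data directory
```

Because WAL segments are archived continuously, this also gives you point-in-time recovery rather than just nightly snapshots.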

