How can I back up a 363GB Postgres database directly to S3?

Asked By CuriousCoder92 On

Hey everyone, I'm looking for advice on backing up a large Postgres database (363GB of data) directly to Amazon S3. I can't dump it locally first because I don't have enough disk space. My plan is to use the AWS SDK to pipe the output of pg_dump into an S3 upload call. Long-term storage matters here: I plan to keep the data around for a while and may move it to S3 Glacier for archiving. Before diving into the documentation, I wanted to check in here: is it possible to pipe pg_dump into an S3 upload function like s3.upload_fileobj for a database this large?

5 Answers

Answered By DockerAdmin On

I recently set up a similar service at work using Docker. You can create an ECS service on Fargate to run a scheduled task that dumps the tables you need and sends them to S3 via AWS CLI commands. Just set your EventBridge cron schedule to run at midnight or whenever you prefer.
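
For the schedule itself, a minimal boto3 sketch might look like the following; the rule name is hypothetical, and pointing the rule at your Fargate task definition (an events.put_targets call with EcsParameters) is left out:

```python
import boto3

# Hypothetical rule name; fires every day at 00:00 UTC.
# Attaching the ECS/Fargate task as the rule's target is a separate
# events.put_targets call, omitted here.
events = boto3.client("events")
events.put_rule(
    Name="nightly-pg-backup",
    ScheduleExpression="cron(0 0 * * ? *)",
    State="ENABLED",
)
```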

Answered By DatabaseGuru On

Yeah, you can definitely pipe pg_dump into s3.upload_fileobj, but there's one catch. Under the hood it's a multipart upload, and boto3 manages that for you, but S3 caps a multipart upload at 10,000 parts: at the default 8MB chunk size you'd top out around 80GB, so for 363GB you'll need to raise multipart_chunksize via a TransferConfig (see the sketch below). Alternatively, spin up an EC2 instance with enough disk space to hold the dump and upload to S3 in two steps. Depending on your network setup, that might be easier.
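
A minimal sketch of the streaming approach, assuming a database named mydb and a bucket named my-backup-bucket (both placeholders):

```python
import subprocess

import boto3
from boto3.s3.transfer import TransferConfig

# 64MB parts keep a ~363GB stream well under S3's 10,000-part cap;
# the default 8MB chunk size tops out around 80GB.
config = TransferConfig(multipart_chunksize=64 * 1024 * 1024)
s3 = boto3.client("s3")

# pg_dump writes the dump to stdout; upload_fileobj reads the pipe in
# chunks and handles the multipart upload, so nothing hits local disk.
with subprocess.Popen(["pg_dump", "mydb"], stdout=subprocess.PIPE) as proc:
    s3.upload_fileobj(
        proc.stdout, "my-backup-bucket", "backups/mydb.sql", Config=config
    )
    if proc.wait() != 0:
        raise RuntimeError("pg_dump exited with a non-zero status")
```

One caveat: if the upload fails partway through, you start over from scratch, since a dump stream can't be resumed. That's the main argument for the two-step EC2 approach.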

Answered By BackupBoss99 On

You might be able to do that, but a simpler solution could be to just plug in a 1TB external hard drive and use pg_dump to back it up there instead.

Answered By CloudNinja On

Check out pgBackRest, a dedicated backup tool with built-in S3 repository support. It can do incremental backups, which could save you a lot on transfer sizes and storage costs.
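
Roughly what the S3 repository config looks like; the bucket, region, stanza name, and paths below are placeholders, and credentials can come from an instance role or the repo1-s3-key / repo1-s3-key-secret options:

```ini
# /etc/pgbackrest/pgbackrest.conf
[global]
repo1-type=s3
repo1-s3-bucket=my-backup-bucket
repo1-s3-region=us-east-1
repo1-s3-endpoint=s3.us-east-1.amazonaws.com
repo1-path=/pgbackrest
repo1-retention-full=2

[main]
pg1-path=/var/lib/postgresql/15/main
```

After a one-time `pgbackrest --stanza=main stanza-create`, scheduled runs of `pgbackrest --stanza=main --type=incr backup` push only the changed files to the bucket.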

Answered By TechSavvyDude On

You can absolutely do this straight from the command line! I had this setup on an 8GB instance and backed up a database a couple of hundred GB in size:

```sh
# Assumes the usual libpq environment variables (PGDATABASE, PGHOST,
# PGUSER, ...) point at your database. For streams over ~50GB, pass
# --expected-size (in bytes) so the CLI picks a multipart chunk size
# that stays within S3's 10,000-part limit.
pg_dump | aws s3 cp - s3://bucket/backup.sql --expected-size 400000000000
```

For better compression, you can add gzip like this:

```sh
# Same idea; the size estimate can be lower since gzip shrinks the stream.
pg_dump | gzip | aws s3 cp - s3://bucket/backup.sql.gz --expected-size 200000000000
```
