How to Manage Long-Term RDS Backups Cost-Effectively?

Asked By CuriousCoder22 On

I'm dealing with a compliance requirement for long-term backups of our RDS (PostgreSQL) instance. We need to keep daily backups for 270 days, but storing everything in AWS Backup is turning out to be quite expensive. Right now I'm only retaining 90 days. I want to find a way to reduce costs while still meeting the compliance requirement.

Using the Export to S3 feature isn't ideal since it only exports to Parquet format, and I need to be able to restore a specific day with pg_restore. I'm considering a Lambda function scheduled by EventBridge that runs a compressed pg_dump and writes it to an S3 bucket with Object Lock in compliance mode. I might also keep AWS Backup or automated snapshots for a shorter window, say 30 days, so users can still restore quickly when they need to.
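To make the idea concrete, here's a minimal sketch of what that scheduled function might look like. The bucket name, environment variables, and the assumption that a pg_dump binary is bundled with the function (via a layer or container image) are placeholders rather than a tested setup:

```python
import os
import subprocess
from datetime import datetime, timedelta, timezone

import boto3

# Hypothetical bucket, created with Object Lock enabled.
S3_BUCKET = "example-pg-compliance-backups"
RETENTION_DAYS = 270

s3 = boto3.client("s3")


def handler(event, context):
    """Run a compressed pg_dump and upload it to the compliance bucket."""
    now = datetime.now(timezone.utc)
    stamp = now.strftime("%Y-%m-%d")
    dump_path = f"/tmp/backup-{stamp}.dump"

    # Custom format (-Fc) is compressed and restorable with pg_restore.
    # Connection details and PGPASSWORD are expected in the environment
    # (ideally injected from Secrets Manager).
    subprocess.run(
        [
            "pg_dump",
            "-h", os.environ["PGHOST"],
            "-U", os.environ["PGUSER"],
            "-d", os.environ["PGDATABASE"],
            "-Fc",
            "-f", dump_path,
        ],
        check=True,
    )

    # Per-object Object Lock retention in compliance mode; a default
    # retention period on the bucket would work as well.
    s3.upload_file(
        dump_path,
        S3_BUCKET,
        f"daily/{stamp}.dump",
        ExtraArgs={
            "ObjectLockMode": "COMPLIANCE",
            "ObjectLockRetainUntilDate": now + timedelta(days=RETENTION_DAYS),
        },
    )
```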

What do you all think? Am I missing any key considerations? I know the S3 route could still be costly, but I think it might be more manageable than AWS Backup.

4 Answers

Answered By BudgetHawk94 On

What’s your target budget looking like? That could help narrow down the best solution for you.

Answered By CloudNinja77 On

You need to keep in mind that Lambda has a time limit of 15 minutes. If your dumps take longer than that, you might want to switch to an ECS task instead. Also, you could store all the dumps in the S3 bucket and use lifecycle rules to transition them to a Glacier storage class and expire them after the retention period; that might save you money.
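For what it's worth, the lifecycle part is a one-time bucket configuration. A rough boto3 sketch, assuming a hypothetical bucket name and prefix and that Glacier retrieval times are acceptable for compliance restores:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket/prefix: move daily dumps to Glacier after 30 days and
# expire them once the 270-day compliance window has passed. Lifecycle
# expiration cannot permanently delete a version still under Object Lock
# retention, so keep the two windows aligned.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-pg-compliance-backups",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire-daily-dumps",
                "Filter": {"Prefix": "daily/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 271},
            }
        ]
    },
)
```

Glacier Deep Archive (`DEEP_ARCHIVE`) would be cheaper again if multi-hour restores are acceptable for your compliance case.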

CuriousCoder22 -

Yeah, I totally overlooked that Lambda limitation. Thanks for bringing it up!

Answered By DataDynamo89 On

You really need to think about the total cost of ownership here. AWS Backup is easy and reliable because it's a fully managed service. If something breaks, who's going to monitor your custom Lambda backup solution? And can you produce evidence in case of an audit? That's where the hidden costs come in. At my last company we ran a similar setup with Oracle databases and S3, but honestly the complexity wasn't worth it once AWS fixed their Backup integration. You might find that AWS Backup actually saves you money once you account for your real daily change rate and compression ratios, which are usually smaller than you'd think.

By the way, if your daily changes are only about 10%, that could significantly reduce your backup costs with AWS Backup, as you'd only pay for the incremental changes. I crunched some numbers and it came out to around $832 a month for 400GB with a 10% change rate, while storing full pg_dumps would easily surpass that.
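To make that kind of estimate reproducible, here's a rough back-of-envelope sketch. The per-GB prices and the compression ratio are assumptions for illustration only, so check them against the current price list; your totals may land somewhere different from the figure above:

```python
# Back-of-envelope comparison: incremental snapshot storage vs. full daily dumps.
# All prices below are assumptions; check the current AWS price list for your region.

DB_SIZE_GB = 400
DAILY_CHANGE_RATE = 0.10
RETENTION_DAYS = 270

SNAPSHOT_PRICE_PER_GB_MONTH = 0.095    # assumed RDS snapshot / AWS Backup warm storage price
S3_STANDARD_PRICE_PER_GB_MONTH = 0.023 # assumed S3 Standard price
DUMP_COMPRESSION_RATIO = 0.5           # assumed: pg_dump -Fc roughly halves the size

# Incremental snapshots: one full copy plus ~10% changed data per retained day.
snapshot_gb = DB_SIZE_GB + DB_SIZE_GB * DAILY_CHANGE_RATE * (RETENTION_DAYS - 1)
snapshot_cost = snapshot_gb * SNAPSHOT_PRICE_PER_GB_MONTH

# Full daily dumps in S3 Standard: every retained day stores a compressed full copy.
dump_gb = DB_SIZE_GB * DUMP_COMPRESSION_RATIO * RETENTION_DAYS
dump_cost = dump_gb * S3_STANDARD_PRICE_PER_GB_MONTH

print(f"Incremental snapshots: ~{snapshot_gb:,.0f} GB -> ~${snapshot_cost:,.0f}/month")
print(f"Full dumps in S3:      ~{dump_gb:,.0f} GB -> ~${dump_cost:,.0f}/month")
```

Swapping the S3 Standard price for a Glacier-class price in the same sketch shows how much the lifecycle tiering suggested above changes the comparison.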

LambdaLover47 -

You mentioned a daily rate of $50 for backups—have you figured out how many months you’re planning to keep your backup data? That could change things a lot.

CleverTechie50 -

That’s interesting! I’ve noticed a similar cost trend with daily increments being more manageable.

Answered By BackupBuster33 On

Honestly, AWS Backup isn’t the best out there. You might want to check out Commvault or alternatives like it. They might provide better options for your needs.

CuriousCoder22 -

I’m definitely going to take a look at Commvault. Their pricing details aren’t very clear, though. Any idea how their costs stack up?

DataDynamo89 -

I don’t have a clear breakdown of costs either; I agree, the vague pricing can be a red flag. Here's their pricing page: https://www.commvault.com/packaging.

