What’s the Best Way to Handle Long-Running Airflow Tasks on AWS?

Asked By TechWhiz123 On

I'm currently transitioning a local Airflow setup to AWS and plan to use Amazon MWAA (Managed Workflows for Apache Airflow). My Python tasks run for a long time and demand significant processing power (we currently use GPUs locally), so I'm trying to figure out the best way to execute them in AWS. Should I run them in containers on Fargate, use AWS Batch, or manage a cluster of EC2 instances? Any advice would be greatly appreciated!

2 Answers

Answered By DataDynamo99 On

Using AWS Batch on EC2 is a solid option for this. Airflow's Amazon provider package (which MWAA ships with) includes a BatchOperator (formerly called AwsBatchOperator) that submits a Batch job and waits for it to finish, so MWAA just orchestrates while Batch runs the heavy, long-running work on appropriately sized instances.
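A minimal sketch of what that DAG could look like, assuming the apache-airflow-providers-amazon package is available (it is bundled with MWAA). The job queue, job definition, and command here are placeholder names you'd replace with your own; note that older provider versions call the `container_overrides` argument `overrides`:

```python
# Sketch: an MWAA DAG that offloads a long-running GPU job to AWS Batch.
# "my-gpu-job-queue" / "my-gpu-job-def" are placeholders for resources
# you would create in AWS Batch beforehand.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.batch import BatchOperator

with DAG(
    dag_id="gpu_training_via_batch",
    start_date=datetime(2024, 1, 1),
    schedule=None,  # trigger manually
    catchup=False,
) as dag:
    train = BatchOperator(
        task_id="submit_training_job",
        job_name="gpu-training-run",       # placeholder
        job_queue="my-gpu-job-queue",      # placeholder
        job_definition="my-gpu-job-def",   # placeholder
        # Optionally override the command baked into the job definition:
        container_overrides={"command": ["python", "train.py"]},
    )
```

The operator polls the job until it reaches SUCCEEDED or FAILED, so the MWAA worker slot stays lightweight while the actual compute happens in Batch.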

Answered By CloudExplorer88 On

Long-running tasks with heavy GPU usage sounds a lot like model training. For bulk jobs, AWS Batch or ECS can simplify the orchestration side, but if your workload is steady and predictable, plain EC2 with Reserved Instances (or a Savings Plan) may be the cheapest option. One caveat: I'm not certain ECS supports every top-end GPU instance type you might need, so check instance availability before committing.
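For the Batch route, the GPU requirement is declared on the job definition, and Batch then places the job onto a GPU-capable EC2 compute environment. A hedged sketch using boto3's real `register_job_definition` call; the image URI, IAM role ARN, and resource sizes are placeholders, and this obviously needs valid AWS credentials to run:

```python
# Sketch: register an AWS Batch job definition that requests one GPU.
# All names/ARNs below are placeholders for your own resources.
import boto3

batch = boto3.client("batch", region_name="us-east-1")  # example region

response = batch.register_job_definition(
    jobDefinitionName="my-gpu-job-def",  # placeholder
    type="container",
    containerProperties={
        "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/trainer:latest",  # placeholder
        "jobRoleArn": "arn:aws:iam::123456789012:role/batch-job-role",  # placeholder
        "resourceRequirements": [
            {"type": "VCPU", "value": "4"},
            {"type": "MEMORY", "value": "16384"},  # MiB
            {"type": "GPU", "value": "1"},  # exposes an NVIDIA GPU to the container
        ],
    },
)
print(response["jobDefinitionArn"])
```

With the GPU requirement in place, Batch will only schedule the job onto instances from a compute environment whose instance types actually have GPUs (e.g. g- or p-family instances).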
