How can I set up GPU processing for video streams on AWS?

Asked By CreativeCoder42

I'm looking to run a 3D reconstruction algorithm that leverages GPU (CUDA) capabilities. Currently, I'm using a local Docker environment to manage this process. However, I want to migrate everything to AWS. I've found out that AWS Lambda doesn't support GPU workloads, but to optimize costs, I want to ensure I'm only charged when the code executes. My goal is to have the infrastructure set up so that it gets triggered every time my server receives a video stream URL. I'm considering an architecture that looks like API Gateway -> Lambda -> EC2/ECS. Is this feasible?

3 Answers

Answered By ServerSavvy

This sounds like a fun project! I recommend implementing some queue management. You can set it up as API Gateway -> Lambda (to handle incoming requests) -> then push tasks to an SQS queue. From there, use AWS Batch to manage jobs. It scales your EC2 instances as needed, so you avoid paying for idle GPU instances. If real-time processing is crucial, you could keep an instance running 24/7 for immediate tasks, though that could get pricey depending on your needs.
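A minimal sketch of the Lambda step in that chain, assuming the request body carries a `stream_url` field and a queue named `video-jobs` (both names are placeholders, not anything from your setup):

```python
import json

def build_job_message(event):
    """Turn an API Gateway proxy event into an SQS message body.
    Raises ValueError if the request carries no stream URL."""
    body = json.loads(event.get("body") or "{}")
    url = body.get("stream_url")
    if not url:
        raise ValueError("request body must contain 'stream_url'")
    return json.dumps({"stream_url": url})

def handler(event, context):
    # boto3 ships with the Lambda runtime; imported here so the pure
    # message-building logic above stays testable without AWS access.
    import boto3
    sqs = boto3.client("sqs")
    sqs.send_message(
        QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/video-jobs",  # placeholder
        MessageBody=build_job_message(event),
    )
    return {"statusCode": 202, "body": "queued"}
```

Keeping the parsing separate from the `send_message` call makes the handler trivial to unit-test locally.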

CreativeCoder42 -

About the EC2 instances launched by AWS Batch, how do I make sure they use the Docker image I built? I've heard of ECR, but I'm worried about its image-size limit; my container is about 20 GB.

Answered By TechieTom

Absolutely, that's a pretty standard setup! Just curious, how often will this process kick off? Also, how much time does it take to process those video files? Are you doing any transcoding? AWS has some tailored solutions for that.

CreativeCoder42 -

It’s not really frequent, maybe just a couple of times a week. The reconstruction process usually takes about 5 to 10 minutes. No transcoding here; I’m using photogrammetry with COLMAP and OpenMVS.

Answered By CloudNinja99

You might want to check out AWS Batch for this. You can define a job using your Docker container, and AWS Batch will manage spinning up an EC2 instance as needed. It shuts everything down once the job finishes, which keeps costs low. You can trigger the batch job through API Gateway or S3 events.
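Roughly, the submission call from whatever trigger you choose could look like this; the job-queue and job-definition names are placeholders for whatever you register in AWS Batch, and the stream URL is handed to the container via an environment variable:

```python
def build_submit_args(stream_url,
                      job_queue="gpu-queue",           # placeholder name
                      job_definition="colmap-reconstruct"):  # placeholder name
    """Build the kwargs for batch.submit_job(); the container reads
    STREAM_URL from its environment to know what to process."""
    return {
        "jobName": "reconstruct",
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
        "containerOverrides": {
            "environment": [{"name": "STREAM_URL", "value": stream_url}],
        },
    }

def submit(stream_url):
    import boto3  # imported lazily so build_submit_args stays testable offline
    batch = boto3.client("batch")
    return batch.submit_job(**build_submit_args(stream_url))
```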
