Best Method for Uploading Large Videos to S3: Presigned URLs or Multipart?

Asked By ChillNinja42 On

I'm working on an app that lets users upload videos, some of which are over 100 MB. I've used S3 presigned URLs before, and I'm looking at them again here so large files don't have to be routed through my API. After digging into the pros and cons, here's what I've gathered:

- **Presigned POST**: It supports setting a `content-length-range`, but I hear it's not ideal for large files.
- **Presigned PUT**: This one seems straightforward, but it doesn't enforce file size limits on the server side.
- **Multipart Upload**: It's more suitable for larger files and allows for retries, but it also doesn't inherently enforce size restrictions.

Here are my possible approaches:

1. Go with presigned PUT and rely on client-side validation (which doesn't feel very secure).
2. Use multipart upload with post-upload validation via Lambda. The catch is that the Lambda only fires after the upload completes, so it can't stop someone from uploading something huge, like 10 TB, in the first place. To mitigate that, I was thinking of issuing short-lived presigned URLs and capping the number of parts (e.g., fewer than 5 parts, with URLs that expire in under 5 minutes).

Is this plan reasonable? Is there a way to enforce size checks before a multipart upload starts? Also, for files around 200 MB, should I just use PUT, or would multipart be unnecessary? Thanks for any insights!

3 Answers

Answered By VideoUploader101 On

How strict do you need to be? You can easily check the file size with JavaScript before uploading and prevent oversized files client-side, then use multipart uploads for the rest.

Answered By S3Fan99 On

If your files are only up to 100 MB, I wouldn't worry too much about multipart uploads. They could complicate things, honestly.

Answered By CloudWhiz88 On

Did you know you can actually combine presigned URLs with multipart uploads? There's a great blog post about it: aws.amazon.com/blogs/compute/uploading-large-objects-to-amazon-s3-using-multipart-upload-and-transfer-acceleration/. Before kicking off the upload, have the client send the file metadata (such as the size) to your server. Even if someone sends fake metadata, you can still cap their upload at, say, 200 MB. Be sure to verify the file size again after the upload completes, or compare an MD5 checksum, to confirm the uploaded object matches the metadata you received. If someone abuses the API, you can block them right away.
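The "send metadata first" gate described above is plain application logic. A sketch (the 200 MB cap mirrors the number in the answer; the function name is made up):

```python
MAX_UPLOAD_BYTES = 200 * 1024 * 1024  # 200 MB cap, as suggested above

def authorize_upload(declared_size_bytes: int) -> bool:
    """Run before issuing any presigned URL.

    The declared size comes from the client and can be forged, so this is
    only a first gate; the real size still has to be verified after upload.
    """
    if declared_size_bytes <= 0:
        raise ValueError("declared size must be positive")
    if declared_size_bytes > MAX_UPLOAD_BYTES:
        raise ValueError("declared size exceeds the 200 MB limit")
    return True
```

Only after this passes would the server sign and hand out upload URLs for that object.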

S3Expert12 -

Yeah, but presigned URLs alone can't strictly enforce the exact size of each part, can they? As long as we validate the size afterwards and keep the URL expiration short, that should mitigate misuse.

