Hey everyone! I'm trying to move some heavy functions to AWS Lambda to save on costs compared to running a worker 24/7. However, I'm hitting the 50 MB limit for direct zip uploads because I depend on large libraries like pandas and Playwright. Is there a way around this limit? I've heard about uploading through an S3 bucket, but I'm not sure that actually changes the size restrictions. Are there better options for handling this? Thanks in advance!
1 Answer
One option you could consider is Lambda layers, but keep in mind that the total unzipped size of your layers plus your function code is still capped at 250 MB. If that doesn't cut it, you can package your Lambda as a container image, which sounds intimidating but raises the maximum deployment size to 10 GB. Check out the official AWS docs for details. Alternatively, you could explore storing your libraries on an EFS volume, mounting it in Lambda, and importing them from there instead.
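To make the container route more concrete, here's a minimal Dockerfile sketch using AWS's public Lambda base image for Python. The file names (`requirements.txt`, `app.py`) and the handler name `app.handler` are placeholders for your own project; `LAMBDA_TASK_ROOT` is an environment variable the base image provides.

```dockerfile
# AWS-provided base image for Python Lambdas (adjust the Python version as needed)
FROM public.ecr.aws/lambda/python:3.12

# Install heavy dependencies (e.g. pandas, playwright) into the task root
COPY requirements.txt .
RUN pip install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"

# Copy your function code
COPY app.py "${LAMBDA_TASK_ROOT}"

# Point the Lambda runtime at your handler: <module>.<function>
CMD ["app.handler"]
```

You'd build and push this image to Amazon ECR and create the function from that image. Pricing is the same per-invocation model as zip-based Lambdas, though cold starts can be somewhat longer for large images.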

Thanks for the detailed reply! I'm definitely going to look into Docker; I was just concerned about potential cost increases. I hadn't heard of EFS before, so I appreciate the suggestion!