Is TypeScript Suitable for Handling 50K Rows on AWS ECS?

Asked By CuriousCoder23 On

I'm developing a task on Amazon ECS using TypeScript, where I'll fetch data from an external API, compare it with a DynamoDB table, and send any new or updated rows back to the API. The dataset I'm working with consists of about 50,000 rows and around 30 columns. I've successfully implemented similar logic in Python with libraries like pandas and polars before, but I'm leaning towards TypeScript because of the existing abstractions around DynamoDB access and the AWS CDK infrastructure we have in place. Given the data size and the complexity of the differences I'm calculating, I'm wondering if TypeScript is a good fit for this job on ECS or if I should be looking for alternatives. Have any of you dealt with similar situations?
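For context, here's a rough sketch of the load step I have in mind, assuming the AWS SDK for JavaScript v3; the table name, the "id" key attribute, and the Row type are just placeholders for illustration:

    import { DynamoDBClient, paginateScan } from "@aws-sdk/client-dynamodb";
    import { unmarshall } from "@aws-sdk/util-dynamodb";

    // Placeholder row shape: ~30 columns keyed by an "id" attribute.
    type Row = Record<string, unknown> & { id: string };

    // Scan the whole table page by page and index it by key for fast lookups.
    async function loadTable(tableName: string): Promise<Map<string, Row>> {
      const client = new DynamoDBClient({});
      const rows = new Map<string, Row>();
      for await (const page of paginateScan({ client }, { TableName: tableName })) {
        for (const item of page.Items ?? []) {
          const row = unmarshall(item) as Row;
          rows.set(row.id, row);
        }
      }
      return rows;
    }

My assumption is that 50K rows of 30 columns fits comfortably in memory on the ECS task, but that's part of what I'm asking about.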

5 Answers

Answered By OptimizedNinja On

50K rows isn't much to worry about as long as your processing logic is efficient. Keep the comparison roughly linear (key-based lookups rather than nested loops) and you'll be fine!

JuniorDev87 -

True, but if you're doing complex joins across the two datasets, index them by key first; a naive nested-loop comparison over 50K rows gets slow fast. Something like the sketch below keeps it linear.
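Rough sketch; Row, the "id" key, and the equality check are just illustrative:

    // Key-based diff: O(n) map lookups instead of a nested-loop join.
    type Row = Record<string, unknown> & { id: string };

    function diffRows(apiRows: Row[], tableRows: Map<string, Row>): Row[] {
      const changed: Row[] = [];
      for (const apiRow of apiRows) {
        const existing = tableRows.get(apiRow.id);
        // New row, or any column differs from what's stored -> flag it.
        // JSON.stringify is a crude check (key order matters); compare
        // field by field if your rows aren't normalized.
        if (!existing || JSON.stringify(existing) !== JSON.stringify(apiRow)) {
          changed.push(apiRow);
        }
      }
      return changed;
    }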

SkepticalDev22 -

Yeah, no need to overthink this. Just keep it simple!

Answered By LambdaLover88 On

I think using ECS for this might be overkill given the size of your dataset. You could consider AWS Lambda, orchestrated with Step Functions if needed; it would likely be cheaper and simpler to operate for this amount of data.

PracticalDev44 -

Not sure about that; a 50K row dataset doesn’t need that level of complexity. ECS tasks can handle it easily, especially if you're already set up there.

CasualCoder11 -

I agree. You might be complicating things unnecessarily with Lambdas for such a simple pipeline.

Answered By PythonFanatic42 On

Honestly, Python doesn't have an edge over TypeScript for your case. If you used something like Java or Golang with multithreading you might see a small performance gain, but between Python and TypeScript the difference should be negligible. Just go with what you're comfortable with!

Answered By ECS_Enthusiast On

If you're using Node.js, it scales well beyond what you need here. As long as you don't have strict latency requirements, this setup will work fine, assuming a reasonably careful implementation.

Answered By DataDynamo99 On

We handle ETL pipelines with 100 million records using TypeScript; you'll be just fine with 50K rows!
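The one thing I'd do at any size is batch the calls back to the API instead of firing a request per row. Rough sketch; BATCH_SIZE, the endpoint, and sendBatch are placeholders for whatever your API's real limits and client look like:

    // Push changed rows back in fixed-size batches so 50K rows doesn't
    // become 50K simultaneous HTTP requests.
    const BATCH_SIZE = 100; // placeholder; match your API's limits

    async function sendBatch(rows: unknown[]): Promise<void> {
      // Placeholder endpoint; swap in your real API client.
      await fetch("https://api.example.com/rows", {
        method: "POST",
        headers: { "content-type": "application/json" },
        body: JSON.stringify(rows),
      });
    }

    async function pushChanges(rows: unknown[]): Promise<void> {
      for (let i = 0; i < rows.length; i += BATCH_SIZE) {
        await sendBatch(rows.slice(i, i + BATCH_SIZE));
      }
    }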

