Scaling Docker Containers for Trading Bots: What’s the Best Approach?

Asked By TechieNinja96 On

I'm developing a platform where users can run their own Python trading bots, with each strategy isolated in its own Docker container. With 10 users each running 3 strategies, that's 30 containers running at the same time, and I'm wondering whether this architecture is sustainable long-term.

Currently, I'm facing a couple of challenges:
1. When a user stops all of their strategies, the whole system lags, because I'm shutting down every one of their Docker containers (see the sketch below).
2. I'm also fetching user balances and related info every 30 seconds, and the web app feels sluggish.

With plans to scale up to 500+ users, should I reconsider my entire setup? I'd love to hear any advice from anyone who's tackled similar scalability issues!
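For context on the first issue, docker stop waits up to 10 seconds per container by default, so stopping several containers one at a time from inside a web request adds up quickly. A minimal sketch of issuing the stops concurrently, assuming the Docker SDK for Python (docker-py) and made-up container names:

    from concurrent.futures import ThreadPoolExecutor

    import docker  # pip install docker

    client = docker.from_env()

    def stop_user_strategies(container_names):
        """Issue all stop calls in parallel so one user's shutdown doesn't block
        the request for N x (stop timeout) seconds."""
        containers = [client.containers.get(name) for name in container_names]
        if not containers:
            return
        with ThreadPoolExecutor(max_workers=len(containers)) as pool:
            # timeout=5 gives each bot 5 seconds to exit cleanly before SIGKILL
            pool.map(lambda c: c.stop(timeout=5), containers)

    # Hypothetical container names, one per strategy:
    # stop_user_strategies(["user42-strategy-1", "user42-strategy-2", "user42-strategy-3"])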

2 Answers

Answered By DockerDude21 On

For the stopping issue, consider not shutting down the whole container. Instead, just kill the Python script inside it and keep the container running; skipping the container teardown avoids some of the lag you're encountering.
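A minimal sketch of that idea, assuming the docker-py SDK, that the strategy runs as a process whose command line contains strategy.py, and that pkill (procps) is installed in the image; all names are placeholders:

    import docker  # pip install docker

    client = docker.from_env()

    def stop_strategy(container_name, pattern="strategy.py"):
        """Kill the strategy process inside a running container; the container stays up."""
        container = client.containers.get(container_name)
        exit_code, output = container.exec_run(["pkill", "-f", pattern])
        if exit_code != 0:
            print("no matching process in %s: %s" % (container_name, output.decode()))

    def restart_strategy(container_name, script="/app/strategy.py"):
        """Relaunch the strategy inside the same container without recreating it."""
        container = client.containers.get(container_name)
        container.exec_run(["python", script], detach=True)

Killing the process this way returns almost instantly, whereas docker stop waits for the container's main process to exit (up to its timeout) and then tears down the container's network and mounts.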

CuriousCoder88 -

That makes sense! Just to clarify, my setup has users interacting with an AI that generates Python code. Each generated script depends on shared files (like loggers and APIs), so I assumed each strategy needed its own container to manage those. It sounds like I might be overdoing it!

Answered By ScalableGuru99 On

What you're doing might work, but it could become very costly at scale. Instead of creating a new Docker container for each strategy, consider running multiple Python processes (a sketch follows below). You'll save a lot on resources and keep costs down as you scale toward 500 users.
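A rough sketch of the process-per-strategy idea using only the standard library; the registry layout and script paths are assumptions, not a prescribed design:

    import subprocess
    import sys

    # (user_id, strategy_name) -> running process
    running = {}

    def start(user_id, strategy_name, script_path):
        """Run one strategy as a child process instead of a dedicated container."""
        running[(user_id, strategy_name)] = subprocess.Popen([sys.executable, script_path])

    def stop(user_id, strategy_name):
        """Stop a single strategy quickly, without touching any other process."""
        proc = running.pop((user_id, strategy_name), None)
        if proc and proc.poll() is None:
            proc.terminate()                     # SIGTERM first
            try:
                proc.wait(timeout=10)
            except subprocess.TimeoutExpired:
                proc.kill()                      # escalate if the bot ignores SIGTERM

The trade-off is weaker isolation: since the strategies are AI-generated code, processes sharing one host also share its filesystem and resource limits, so a middle ground could be one container per user with one process per strategy inside it.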
