I'm working on a personal project that relies on a Postgres database and two custom applications (one in Python, the other in C++). Every minute the setup makes a GET request and runs an analysis program on the result, processing a moderate amount of data (about 14 GB per month). I'm planning to deploy this with Docker Compose. Initially I considered buying a NUC, since it has decent CPU power, and running everything at home, but I have little experience with cloud providers or with deploying custom images through Docker Compose. So I'm curious about the best approaches and what the community suggests for keeping this project running 24/7.
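Roughly, the Compose file I have in mind looks like this (service names, images, and credentials are just placeholders):

```yaml
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: app            # placeholder credentials
      POSTGRES_PASSWORD: change-me
      POSTGRES_DB: project
    volumes:
      - pgdata:/var/lib/postgresql/data   # keep data outside the container

  fetcher:
    build: ./fetcher                # Python app: GET request every minute
    depends_on:
      - db
    restart: unless-stopped

  analysis:
    build: ./analysis               # C++ app: runs the analysis
    depends_on:
      - db
    restart: unless-stopped

volumes:
  pgdata:
```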
5 Answers
For something more robust, AWS ECS might be the right way to go. You'll need to read up on a few tutorials, but it lets you split your workload across multiple services: for example, one service for the GET requests and another for the data processing. That way, if one part fails, the rest keeps running and your data stays intact.
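If you go that route, a rough sketch with the AWS CLI looks like this (cluster, service, and task-definition names are placeholders, and the launch-type and networking configuration are omitted):

```bash
# Create a cluster to hold both services
aws ecs create-cluster --cluster-name personal-project

# Register a task definition described in a local JSON file (one per workload)
aws ecs register-task-definition --cli-input-json file://fetcher-task.json

# Run it as a long-lived service; repeat for the analysis workload
aws ecs create-service \
  --cluster personal-project \
  --service-name fetcher \
  --task-definition fetcher \
  --desired-count 1
```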
You might want to try a Raspberry Pi if everything runs locally and doesn't need to be reachable from outside. They're energy-efficient and can handle this kind of workload quite well. Just keep the Postgres data off the SD card, since SD cards wear out quickly under constant writes; use an M.2 SSD instead if you can find one. And since you're the only user, you don't need an elaborate setup: you can simply transfer your Docker images to the Pi over your local network.
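For getting the containers onto the Pi, a rough sketch (hostname, paths, and image names are placeholders; note that images built on an x86 machine need to target ARM, so building on the Pi itself is often simpler):

```bash
# Copy the project to the Pi and build the images there
rsync -av ./project/ pi@raspberrypi.local:~/project/
ssh pi@raspberrypi.local "cd ~/project && docker compose up -d --build"

# Alternatively, stream an already-built (ARM) image over SSH without a registry
docker save analysis:latest | ssh pi@raspberrypi.local docker load
```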
If you're looking for something a bit more powerful, consider renting a VPS. That gives you the resources you need and keeps everything online. Providers like DigitalOcean or Vultr make it easy to run your Docker containers on a small server, with the added benefit of being reachable from anywhere.
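Assuming a fresh Ubuntu VPS, the setup is roughly this (the IP address and paths are placeholders):

```bash
# Install Docker with the official convenience script
ssh root@203.0.113.10 "curl -fsSL https://get.docker.com | sh"

# Copy the project over and start everything in the background
rsync -av ./project/ root@203.0.113.10:/opt/project/
ssh root@203.0.113.10 "cd /opt/project && docker compose up -d --build"
```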
Have you thought about using a service like Heroku? It simplifies the process of managing Docker and PostgreSQL, although it can be pricier than some other options. But if you want convenience, it could be worth the cost. Plus, all your resources can be managed in one place, which can save you a lot of hassle.
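Note that Heroku doesn't run a Compose file directly; you push each container through its registry and attach Postgres as a managed add-on. A rough sketch with the Heroku CLI (the app name is a placeholder):

```bash
heroku login
heroku create my-analysis-app

# Managed Postgres as an add-on instead of running your own container
heroku addons:create heroku-postgresql -a my-analysis-app

# Build and push the image for a worker-type process, then release it
heroku container:login
heroku container:push worker -a my-analysis-app
heroku container:release worker -a my-analysis-app
```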
Running it on your own machine is definitely an option if you're nervous about a 24/7 system. Just make sure the machine can handle the uptime and the load. If the analysis runs continuously, plan for issues like memory leaks, for instance by letting Docker restart the container automatically or by restarting it on a schedule.
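A couple of low-effort safeguards, sketched against a Compose setup like the one in the question (the service name is a placeholder):

```yaml
# In docker-compose.yml: bring the container back if it crashes,
# and cap its memory so a leak gets killed instead of swamping the host
services:
  analysis:
    restart: unless-stopped
    mem_limit: 2g
```

You could also restart the service on a schedule with a cron entry that runs `docker compose restart analysis` against your compose file.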