I'm running a NestJS application alongside a PostgreSQL database on a single VPS. I'm considering cluster mode for the NestJS app to maximize CPU utilization, but I've read that the optimal number of Node.js instances usually matches the number of CPU cores. Since most of my workload is database-intensive, I'm unsure this is the best approach. Is there a way to effectively monitor how the load splits between my NestJS app and the database?
3 Answers
To figure out the right number of Node.js instances, stress testing is really helpful: you can see how your app holds up under load, which matters for a server-side application like yours. Monitoring CPU usage, memory usage, and event-loop delay during the tests will tell you how many instances your actual workload justifies.
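A minimal sketch of that kind of monitoring using only Node built-ins (the `snapshot` helper, its field names, and the 20 ms sampling resolution are my choices, not anything NestJS-specific):

```typescript
// Sample process CPU, memory, and event-loop delay while a stress test runs.
import { monitorEventLoopDelay } from 'node:perf_hooks';

// Histogram of event-loop delay, sampled every 20 ms.
export const loopDelay = monitorEventLoopDelay({ resolution: 20 });
loopDelay.enable();

export function snapshot() {
  const cpu = process.cpuUsage();    // microseconds since process start
  const mem = process.memoryUsage(); // bytes
  return {
    cpuUserMs: cpu.user / 1000,
    cpuSystemMs: cpu.system / 1000,
    rssMb: mem.rss / 1024 / 1024,
    heapUsedMb: mem.heapUsed / 1024 / 1024,
    // p99 event-loop delay in ms; a rising value means the loop is saturating
    loopDelayP99Ms: loopDelay.percentile(99) / 1e6,
  };
}

// Example usage during a load test:
// const timer = setInterval(() => console.log(snapshot()), 1000);
```

If p99 event-loop delay climbs while CPU stays low, your bottleneck is likely the database, not the Node.js side.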
Finding the sweet spot for CPU utilization in Node.js isn't straightforward. One instance per core maximizes utilization, but a fully loaded instance can still bog down its event loop and slow response times; running more instances than cores just makes them compete for CPU (and, in your case, with PostgreSQL on the same box). It's about balancing load without overloading the server.
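Since PostgreSQL shares the VPS, one option is to reserve some cores for it when sizing the cluster. A sketch, assuming a reserve of 2 cores and a hypothetical `CLUSTER` env flag (both are my assumptions, not standard settings):

```typescript
import cluster from 'node:cluster';
import { availableParallelism } from 'node:os';

// Clamp the worker count: at least one worker, at most what's left
// after reserving cores for PostgreSQL.
export function workerCount(totalCores: number, reservedCores: number): number {
  return Math.max(1, totalCores - reservedCores);
}

if (cluster.isPrimary && process.env.CLUSTER === '1') {
  const n = workerCount(availableParallelism(), 2); // reserve ~2 cores for Postgres
  for (let i = 0; i < n; i++) cluster.fork();
} else {
  // bootstrap the NestJS app here (each worker runs its own instance)
}
```

Tools like PM2 do the forking for you, but the same reserved-cores reasoning applies when you pick its instance count.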
If the database is the bottleneck, adding Node.js instances won't help; optimize PostgreSQL first. There's often a lot to gain on the database side (indexes, query plans, connection pooling, memory settings like shared_buffers) before throwing more app instances at the problem.
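One piece of that is sizing the app's connection pool. The PostgreSQL wiki's rough heuristic, connections ≈ (cores × 2) + spindles, can be sketched like this (the helper names are mine, and on an SSD-backed VPS the spindle term is usually taken as 1):

```typescript
// Rough pool-sizing heuristic from the PostgreSQL wiki:
// connections ≈ (core_count * 2) + effective_spindle_count.
export function suggestedPoolSize(coreCount: number, spindleCount = 1): number {
  return coreCount * 2 + spindleCount;
}

// With cluster mode, split the budget across workers so the combined
// pools stay under PostgreSQL's max_connections.
export function perWorkerPoolSize(totalPool: number, workers: number): number {
  return Math.max(1, Math.floor(totalPool / workers));
}
```

For example, 4 cores suggest a total pool of about 9 connections; with 4 workers that's roughly 2 connections each, passed as the pool's `max` option.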

Exactly! And don't forget about sudden RPS spikes. A rough way to estimate capacity: CPU utilization = (RPS × per-request CPU cost) / available CPU. Keep that utilization comfortably below 1 so you have enough headroom to meet your response-time SLO even during spikes.
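That estimate can be sketched as a quick utilization check (the 0.75 headroom target is an assumption you'd tune for your SLO):

```typescript
// CPU utilization = demand / supply:
// (requests per second * CPU-seconds per request) / available cores.
export function cpuUtilization(
  rps: number,
  cpuSecondsPerRequest: number,
  cores: number,
): number {
  return (rps * cpuSecondsPerRequest) / cores;
}

// Stay under a headroom target so latency holds up during spikes.
export function canMeetSlo(
  rps: number,
  cpuSecondsPerRequest: number,
  cores: number,
  headroom = 0.75,
): boolean {
  return cpuUtilization(rps, cpuSecondsPerRequest, cores) <= headroom;
}

// Example: 200 RPS at 5 ms of CPU per request on 2 app cores
// gives utilization (200 * 0.005) / 2 = 0.5, within a 0.75 target.
```

The per-request CPU cost is something you'd measure under load (e.g. from the CPU samples discussed above), not guess.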