Why does my script’s latency increase when running multiple instances?

Asked By CuriousCoder42 On

I'm facing a performance issue with my application: latency spikes when I run several instances of a script simultaneously. I've created a simple demonstration using roaring bitmaps, but I observe similar latency increases even with basic array calculations. Each additional instance adds roughly 5% of latency. For example, a single instance of the script runs in about 5.4ms, while five concurrent instances take around 6.7ms each. In my actual application, with larger, sparser bitmaps and many instances, this translates to operations going from 400ms to 900ms, which is far from ideal. It's worth mentioning:

- I'm not using any network, disk I/O, or shared memory
- The processes aren't communicating with each other
- CPU scheduling doesn't seem to be an issue (as shown in my screenshot)
- Disabling SMT (AMD multithreading) yields the same results
- This is a Node.js script, but I've seen similar issues with a PHP version as well
- I'm on Rocky 9, with Node version 20.15

Here's my performance data:

```
# one instance
result: 12502500 - elapsed: 5.443ms
# five instances
result: 12502500 - elapsed: 6.732ms
```

2 Answers

Answered By TechWizard88 On

Check the overall CPU and memory usage on your host machine. Multiple processes need CPU resources, and if they’re crowding the cores, that might account for the increased latency.

Answered By ScriptSleuth33 On

One potential cause could be the CPU's cache hit ratio getting worse as you run more processes in parallel. When the cache can't keep up, that can lead to significant slowdown.

