Modern gaming monitors feature incredibly high refresh rates, reaching up to 240 Hz or even 360 Hz. However, many users claim they can't perceive any significant difference beyond 144 Hz. I'm curious about the reasons behind this: is it due to the limitations of LCD backlights, or is our visual system's processing speed the real bottleneck?
5 Answers
There's a phenomenon called persistence of vision that lets us perceive smooth motion from still images shown in rapid succession. But there's also a limit, often called the flicker fusion threshold, beyond which further increases in refresh rate become imperceptible. That threshold varies from person to person, but for everyone the benefits diminish past a certain point. Know when to stop spending your money!
You'd be surprised how much refresh rate affects perception, but only up to a point. It's like hearing: most people can't detect frequencies above roughly 20 kHz, so beyond that, investing more brings no noticeable benefit.
Our eyes work somewhat like analog cameras: photoreceptors convert light into electrical signals through photochemical reactions. Those reactions take time to complete and reset, and that speed limit plays a big role in how we perceive high refresh rates.
It mainly comes down to biology. Past a certain refresh rate, our eyes and brains simply can't extract more information from the extra frames. For me, the jump from 60 Hz to 144 Hz was dramatic, while 144 Hz to 240 Hz was hardly noticeable.
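The diminishing returns are easy to see in the raw frame times (a quick illustrative calculation, not from any of the answers above): each jump in refresh rate shaves off less time per frame than the one before it.

```python
# Frame duration at each common refresh rate, and how much time each
# upgrade actually saves per frame. The savings shrink quickly.
def frame_time_ms(hz: float) -> float:
    """Duration of a single frame in milliseconds."""
    return 1000.0 / hz

rates = [60, 144, 240, 360]
for prev, cur in zip(rates, rates[1:]):
    saved = frame_time_ms(prev) - frame_time_ms(cur)
    print(f"{prev} Hz -> {cur} Hz: frame time drops by {saved:.1f} ms")
# 60 Hz -> 144 Hz: frame time drops by 9.7 ms
# 144 Hz -> 240 Hz: frame time drops by 2.8 ms
# 240 Hz -> 360 Hz: frame time drops by 1.4 ms
```

So 60 to 144 Hz saves almost 10 ms per frame, while 240 to 360 Hz saves under 1.5 ms, which lines up with why the first upgrade feels much bigger.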
A lot of folks have never done a controlled, blind comparison. Linus Tech Tips ran an informal blind test in which some participants could still detect differences, and even performed measurably better in games, up to around 240-300 Hz. That said, the retina's photochemical response can only keep up to a point: you might register a single bright flash, but you won't process 500 distinct images per second, because each image leaves an afterimage that blends into the next. Perception is ultimately limited by how quickly your photoreceptors can respond to each new frame.
Exactly, and it's fascinating how the response speed of the photoreceptors affects flicker perception, especially at different brightness levels.