I've been seeing something odd: several games seem to perform better at 1440p, getting 20-30% higher FPS than when I run them at 1080p. This isn't an isolated case, either; I've checked multiple sources, including YouTube videos, and the trend holds up across the board.

I'm struggling to grasp the explanations, which usually mention a 'CPU bottleneck.' I'm stuck on why increasing the resolution would seem to reduce the workload on the CPU. If the CPU is the limiting factor, shouldn't it still be the bottleneck at higher resolutions? I get that a higher resolution means more work for the GPU, but doesn't the CPU still have to handle the same processes either way? What does 'less work for the CPU' specifically mean in this context? Can someone explain this in more detail?
1 Answer
Think of it this way: imagine Mom (the GPU) is cooking dinner while Dad (the CPU) is out shopping for ingredients. At 1080p the dishes are small, so Mom finishes each one quickly and then stands around waiting for Dad to come back with the next batch of groceries; Dad is the one holding dinner up. Bump the resolution to 1440p and the dishes get bigger and take longer to cook, so by the time Mom finishes one, Dad is already back with everything she needs for the next. Nobody is waiting on anybody, and the whole kitchen runs more smoothly.

Translated out of the analogy: the CPU's per-frame work (game logic, physics, AI, preparing draw calls) is essentially the same no matter what resolution you render at; only the GPU's work grows with the pixel count. 'Less work for the CPU' doesn't mean the CPU literally does less; it means that, relative to the GPU, it is no longer the part everything else is waiting on. At 1080p the GPU finishes frames faster than the CPU can feed it new ones, so the CPU sets the pace. At 1440p the GPU's bigger frames give the CPU enough time to stay ahead, both parts stay busy, and the game ends up running more smoothly overall.
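If it helps to see the same idea in numbers, here's a tiny back-of-the-envelope sketch. The frame times in it are made-up assumptions for illustration, not measurements from any real game or hardware, and it uses a deliberately simplified model where each frame simply costs max(CPU time, GPU time):

```python
# Toy model: CPU work per frame (game logic, draw calls) is roughly
# constant, while GPU work scales with the number of pixels drawn.
# The millisecond figures below are assumptions for illustration only.

CPU_MS_PER_FRAME = 10.0          # assumed: same at any resolution
GPU_MS_PER_MILLION_PIXELS = 4.0  # assumed: GPU cost scales with pixel count

RESOLUTIONS = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
}

for name, pixels in RESOLUTIONS.items():
    gpu_ms = GPU_MS_PER_MILLION_PIXELS * pixels / 1_000_000
    # In this simple serial model the slower of the two sets the frame time.
    frame_ms = max(CPU_MS_PER_FRAME, gpu_ms)
    bottleneck = "CPU" if CPU_MS_PER_FRAME >= gpu_ms else "GPU"
    print(f"{name}: CPU {CPU_MS_PER_FRAME:.1f} ms, GPU {gpu_ms:.1f} ms "
          f"-> ~{1000 / frame_ms:.0f} FPS ({bottleneck}-bound)")
```

With these made-up numbers, 1080p comes out CPU-bound and 1440p GPU-bound. Note the sketch doesn't reproduce the 20-30% higher FPS you measured; a strictly serial max() model can't, since real engines overlap the CPU's work on the next frame with the GPU's work on the current one, which is the 'working together' part of the analogy. It's only meant to show which side's cost grows with resolution and which stays flat.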
I see what you mean! So it's more about how they work together rather than just about the CPU working less at higher res?