I've noticed a significant increase in my GPU temperatures: from around 61-63°C at full load in the RDR2 benchmark to about 71-73°C now. GPU usage is still hovering around 99%, I'm running the same undervolt and fan curve, and my hotspot delta remains at 10-13°C. The only major change is that I've switched from a 1080p monitor to a 1440p one. I also repasted with Arctic MX-6 a few months ago, which I expected to last longer than this.

Can the switch to 1440p really raise temps this much, or is it more likely something like paste degradation, mounting pressure, or dust? Also, does power consumption increase at higher resolutions even when GPU usage reads 99% in both cases?

For context, the card is an RTX 3080 Gigabyte Turbo OC. Any insights would be great before I consider repasting or opening up the card again!
5 Answers
Have you looked into the power draw differences between 1080p and 1440p? That could give you valuable insight into how much extra load the higher resolution is putting on your system.
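A minimal sketch of how you could make that comparison yourself: sample power draw and core temperature once per second during a benchmark run at each resolution, then compare the averages. This assumes `nvidia-smi` is on your PATH and a single-GPU system; the field names follow its CSV query format.

```python
import subprocess
import time

def parse_line(line):
    """Parse one nvidia-smi CSV row, e.g. '245.1 W, 72' -> (245.1, 72.0)."""
    power_field, temp_field = line.split(",")
    return float(power_field.split()[0]), float(temp_field.strip())

def read_sample():
    """Take one reading from the first GPU (assumes a single-GPU system)."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=power.draw,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_line(out.strip().splitlines()[0])

def log_run(label, seconds=60):
    """Sample once per second during a benchmark run and print averages."""
    samples = []
    for _ in range(seconds):
        samples.append(read_sample())
        time.sleep(1)
    avg_w = sum(w for w, _ in samples) / len(samples)
    avg_t = sum(t for _, t in samples) / len(samples)
    print(f"{label}: avg {avg_w:.1f} W, avg {avg_t:.1f} degC")

# Run once with the benchmark at 1080p, once at 1440p, and compare:
# log_run("1080p run")   # then switch resolution and call log_run("1440p run")
```

If the 1440p run averages noticeably higher wattage at the same 99% utilization, the temperature increase is explained by the workload rather than the cooler.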
The increased temperature is likely due to the resolution change. Higher resolution makes the GPU work harder: 1440p renders roughly 78% more pixels per frame than 1080p, so even though utilization reads 99% in both cases, each "busy" cycle involves heavier shader and memory work, which means more power drawn and more heat to dissipate.
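The pixel-count arithmetic behind that claim is easy to verify:

```python
# Pixel counts per frame at each resolution
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_1440p = 2560 * 1440   # 3,686,400 pixels

ratio = px_1440p / px_1080p
print(f"1440p renders {ratio:.2f}x the pixels of 1080p")  # ~1.78x, i.e. ~78% more
```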
You might want to temporarily switch back to 1080p and see whether your temperatures drop back down; that would isolate the resolution as the cause. At least you're not hitting the thermal throttle point, so that's a good sign!
Are those core temperatures or hotspot temperatures? Just checking, since that distinction changes how you interpret the numbers overall.
What are your ambient temperatures? Sometimes people forget that as spring arrives, overall room temperature can rise, which could also contribute to the higher GPU temps.
