Why Does My GPU Use So Much Power at Idle with Two Monitors?

Asked By PixelProwler42 On

I've noticed that my GPU, an AMD Radeon RX 5700 XT, draws about 30W at idle when I have two monitors connected: one 60Hz display and one 165Hz panel. I even tried setting the 165Hz monitor down to 60Hz, but the power consumption didn't change. With only the 165Hz monitor connected and running at 60Hz, the power usage dropped to around 5-10W. It's strange that having both monitors set to 60Hz still results in 30W at idle. Is this typical for a dual monitor setup, or is something more going on?
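
In case it matters, here's roughly how I've been reading the power figure. This is just a minimal sketch assuming Linux with the amdgpu driver; the card0 path is an assumption, and on some kernels the sensor file is named power1_input rather than power1_average:

```python
import glob
import time

# amdgpu reports average GPU power in microwatts through hwmon.
# "card0" is an assumption; on some kernels the file is power1_input
# instead of power1_average.
paths = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*/power1_average")

for _ in range(10):
    with open(paths[0]) as f:
        microwatts = int(f.read().strip())
    print(f"GPU power: {microwatts / 1_000_000:.1f} W")
    time.sleep(1)  # one sample per second
```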

3 Answers

Answered By TechieTed99 On

The power draw could be influenced by the type of connection you're using. Each connected display needs its own display controller and pixel clock inside the GPU, and different ports (HDMI vs. DisplayPort) may keep different parts of the display engine powered. So even at low refresh rates, simply lighting up a second output can raise idle power.

Answered By CuriousCarl23 On

It's largely just how GPUs work, honestly. With more than one monitor attached, the card usually keeps its memory clock pinned high: VRAM speed can only be changed safely during the vertical blanking interval, and with two displays those intervals rarely line up, so the GPU never downclocks at idle. In the past it was much worse, with some GPUs drawing significantly more power in dual-monitor setups.
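
If you're on Linux with the amdgpu driver, you can actually see this in sysfs: the driver exposes its clock tables, with the active state marked by an asterisk. A quick sketch (the card0 path is an assumption and may differ on your machine):

```python
# Print the amdgpu DPM clock tables; the line marked '*' is the active
# state. Assumes Linux with the amdgpu driver; adjust "card0" if needed.
BASE = "/sys/class/drm/card0/device"

for name in ("pp_dpm_sclk", "pp_dpm_mclk"):  # core clock, memory clock
    with open(f"{BASE}/{name}") as f:
        print(f"--- {name} ---")
        print(f.read())
```

If pp_dpm_mclk shows its highest state active while the desktop is sitting idle, that's the memory clock refusing to drop.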

LazyView34 -

That's frustrating! I didn't realize it was such a common issue.

Answered By DisplayDiva77 On

Mismatched refresh rates could also play a role. The GPU has to drive two different pixel clocks for the two displays, which can keep it from downclocking properly at idle and lead to higher power draw. Setting both monitors to the same refresh rate, or to an integer multiple of each other (like 60Hz and 120Hz), might help reduce consumption.
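
If you're on Linux with X11, one way to test this is to force both outputs to 60Hz with xrandr, for example from a small script. The output names (DP-1, HDMI-A-1) and the 1920x1080 mode below are placeholders, so check `xrandr --query` first:

```python
import subprocess

# Placeholder output names and mode - run `xrandr --query` to see yours.
OUTPUTS = {"DP-1": "1920x1080", "HDMI-A-1": "1920x1080"}

for output, mode in OUTPUTS.items():
    # Set the mode and pin the refresh rate to 60 Hz on each output.
    subprocess.run(
        ["xrandr", "--output", output, "--mode", mode, "--rate", "60"],
        check=True,
    )
```

Then re-check the idle power draw; if the memory clock finally drops, the mismatched refresh rates were the culprit.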
