I'm curious about power consumption spikes for CPUs and GPUs. I've seen recommendations to size your power supply 20% or even 40% above the combined TDP of your components, but they often lack solid backing. Is Thermal Design Power (TDP) a reliable metric for understanding this, or can something like the power limit slider in AMD's Adrenalin software push consumption beyond what TDP indicates?
5 Answers
Remember that a PSU's peak efficiency typically lands around 50% load (80 Plus certification tests at 20%, 50%, and 100% load). So if you're pushing the unit hard, that's an important detail to keep in mind as well.
It's best to target around 50-60% load on your PSU for peak efficiency. For a setup like a 300W GPU plus a 120W CPU and other components (roughly 500W total), a 750W PSU puts you at about two-thirds load, just above that range. A cheaper, lower-wattage PSU might save a bit upfront, but the efficiency loss can add up over time.
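To make that arithmetic concrete, here's a minimal sizing sketch. The 60% load target and the 80W estimate for drives, fans, and motherboard are assumptions from this thread, not vendor guidance:

```python
# Rough PSU sizing from the thread's figures. The 0.6 target load and
# the 80 W "everything else" estimate are assumptions, not vendor guidance.

def recommend_psu_watts(gpu_w: float, cpu_w: float,
                        other_w: float = 80.0,
                        target_load: float = 0.6) -> float:
    """PSU rating that keeps the estimated steady-state draw at target_load."""
    return (gpu_w + cpu_w + other_w) / target_load

estimated_draw = 300 + 120 + 80          # GPU + CPU + drives/fans/board
print(f"Estimated draw: {estimated_draw} W")
print(f"Ideal rating for 60% load: {recommend_psu_watts(300, 120):.0f} W")
# Estimated draw: 500 W
# Ideal rating for 60% load: 833 W
# -> a 750 W unit runs at ~67% load here; an 850 W unit at ~59%.
```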
Looking at just TDP isn't sufficient, because modern GPUs draw millisecond transient spikes well above their rated power, and power supplies handle those spikes differently: some trip their overcurrent protection and shut down, others ride them out. For instance, I've had a Seasonic Prime Titanium 750W that would shut down under stress with my setup, while a Corsair RM750x worked fine. It's really hit or miss unless you're testing with precise equipment.
Thanks for the insight! It really sucks that there's no good resource for power consumption spikes.
To get a real number, consider picking up a Kill-A-Watt meter or using a UPS that displays its load. Keep in mind both measure at the wall, so the reading includes PSU losses and averages too slowly to catch millisecond transients.
If you have access to the hardware, the best way to find out is to use a power meter. Track idle consumption, then see how far it climbs while gaming. For instance, an MSI RTX 5070 Ti is rated for 300W but idles far lower and only approaches that figure under heavy load. Stick with the manufacturer's PSU recommendation for best results, especially if you plan to overclock.
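If you log those wall readings, a few lines turn them into a PSU-load estimate. This is a minimal sketch with made-up readings; the flat 90% efficiency figure is an assumption (roughly 80 Plus Gold at mid-load), not something the meter reports:

```python
# Turning wall-meter readings (e.g., from a Kill-A-Watt) into a rough
# PSU-load estimate. The readings below are hypothetical placeholders.

PSU_RATING_W = 750
PSU_EFFICIENCY = 0.90            # assumed, roughly 80 Plus Gold at mid-load

wall_readings_w = [68, 71, 450, 505, 522, 498, 480]   # hypothetical log

idle_w = min(wall_readings_w)
peak_w = max(wall_readings_w)

# The meter sits on the AC side, so subtract PSU losses to estimate the
# DC power the components actually pull from the supply.
peak_dc_w = peak_w * PSU_EFFICIENCY
load_fraction = peak_dc_w / PSU_RATING_W

print(f"Idle (wall): {idle_w} W, peak (wall): {peak_w} W")
print(f"Estimated peak DC draw: {peak_dc_w:.0f} W "
      f"({load_fraction:.0%} of a {PSU_RATING_W} W PSU)")
# Note: a wall meter averages over about a second, so the millisecond
# transient spikes that can trip a PSU's protection won't show up here.
```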
Definitely, that's another aspect to watch out for!