Hey everyone! I'm looking for some clarity on my setup. I have two 4K TVs connected to my PC: one runs at 120Hz and the other at 60Hz. While I was tweaking settings in the Nvidia Control Panel for my main display, I noticed there's a distinction between 'Ultra HD' resolutions (like 4K x 2K, which is 3840 x 2160) and another set labeled 'PC.' When I select the UHD resolution, I can't set my refresh rate to 120 or 144Hz. But when I choose from the 'PC' list, I can pick the 4K resolution and go up to 120 or 144Hz without a problem.
From what I've read, you need an HDMI 2.1 cable to handle UHD at 120 or 144Hz, yet everything works fine when I use the PC 4K resolution and push it to 144Hz. I'm confused because all the sources I consulted suggest that UHD and PC resolutions are essentially the same, but I see a difference in my outputs. Also, when I select the PC resolution and set it to 144Hz, the 'UHD' label disappears on my TV, even though it still shows my resolution as 3840 x 2160 and HDR is still active. Can anyone help me understand what's going on? I'm autistic, so some of this tech jargon can be really tricky for me to decipher. Thanks a bunch!
3 Answers
The confusion comes from how the Nvidia Control Panel groups resolutions. The 'Ultra HD, HD, SD' list contains the standard TV timings your display advertises (the modes defined for TV broadcast and video signals), and for a long time 4K TVs only supported 60Hz, so that's often the maximum refresh rate you'll see in that list. The 'PC' list uses computer display timings instead, and your TV does support higher refresh rates, which is why it lets you select 120Hz under the PC resolution.
Also, UHD refers to a resolution of 3840x2160, while true 4K can mean 4096x2160. They’re similar enough that people mix them up, but technically they’re not quite the same. Stick with the settings your TV supports to get the best experience!
About the 'UHD' label disappearing with the PC setting: that label is just how your TV describes the input signal it detects. When you feed it a PC timing at a higher refresh rate, the TV categorizes the signal differently and changes the label, sometimes switching its processing mode as well (many sets have a PC/game mode with less post-processing). It doesn't indicate a loss of quality; as you noted, the resolution still reads 3840 x 2160 and HDR stays active. Just keep an eye on the picture to make sure everything looks right.
Regarding your HDMI cable: you're right to consider upgrading to an HDMI 2.1-rated ('Ultra High Speed', 48 Gbit/s) cable to fully utilize 4K at higher refresh rates. Older cables, including the Rocketfish one you're using, might only be rated for HDMI 2.0 speeds (18 Gbit/s), which is enough for 4K60 with full color but not for 4K120 RGB; at 120 or 144Hz the GPU may have to fall back to chroma subsampling (4:2:0) or compression to fit the link. That could definitely explain the differences you see when switching between the resolution lists. So if you want to ensure optimal performance, especially for gaming or high-speed content, investing in a certified Ultra High Speed HDMI cable would be wise.
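To make the bandwidth argument concrete, here's a rough back-of-the-envelope sketch in Python of the uncompressed video data rates involved. It ignores blanking intervals (real signals need somewhat more), so treat it as an approximation, but the comparison against the usable HDMI 2.0 and 2.1 data rates still shows why 4K120 full-color needs the newer spec while 4:2:0 can squeeze through the old one:

```python
# Approximate uncompressed video bandwidth (blanking intervals ignored).

def required_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Data rate in Gbit/s for an uncompressed video stream."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Usable data rates after line encoding overhead:
HDMI_2_0 = 14.4   # Gbit/s (18 Gbit/s raw TMDS, 8b/10b encoding)
HDMI_2_1 = 42.7   # Gbit/s (48 Gbit/s raw FRL, 16b/18b encoding)

modes = [
    ("4K60  8-bit RGB",   60, 24),   # full color, 60Hz
    ("4K120 8-bit RGB",  120, 24),   # full color, 120Hz
    ("4K120 8-bit 4:2:0", 120, 12),  # chroma subsampled, 120Hz
]

for name, hz, bpp in modes:
    need = required_gbps(3840, 2160, hz, bpp)
    print(f"{name}: ~{need:.1f} Gbit/s, "
          f"fits HDMI 2.0: {need <= HDMI_2_0}, "
          f"fits HDMI 2.1: {need <= HDMI_2_1}")
```

Running this shows 4K120 at full 8-bit RGB needs roughly 24 Gbit/s, well past what an HDMI 2.0-era cable is rated for, while 4K60 RGB and 4K120 with 4:2:0 subsampling both fit, which is consistent with what you're seeing.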
Thanks for breaking that down! It’s good to know that these classifications are just a bit outdated. I guess if my TV allows higher refresh rates under the PC section, I might as well take advantage of that, right?