I'm using a CRT monitor, which can have image issues at higher refresh rates, and I've found that shorter cables help reduce these problems. I'm considering using a riser cable to extend my R7 240 graphics card outside the case so it can connect directly to the CRT via a cordless male-to-male VGA connector. I know this might look ridiculous, but it's worth it to me. I've seen riser cables that claim to have electromagnetic-radiation shielding, which makes me nervous, since CRTs emit a lot of electromagnetic radiation. Is my computer at risk of damage if I remove the side panel and press it against the CRT monitor? Also, if I only need PCIe Gen 3 x8 bandwidth, would a Gen 4 or Gen 5 cable provide better data integrity at the lower bandwidth?
I recommend just getting a high-quality shielded VGA cable with ferrite beads on each end. It's a simple solution that could save you a lot of hassle down the line.
Yeah, PCIe signals can be affected by electromagnetic interference, so investing in a well-shielded cable is definitely a smart move.
What exactly happens to the signal if it's affected by EMI? Could it damage my GPU? I'm looking to upgrade to reduce the analog issues, but I'm worried the upgrade might just introduce digital artifacts instead.

I've already upgraded from a 6 ft cable to a thick 1 ft VGA cable, and there was a noticeable improvement, but I'm still seeing worse quality at 100 Hz than at 75 Hz. I'm just trying to get the best image quality I can.
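
The 100 Hz vs 75 Hz difference is expected for an analog link: the pixel clock scales roughly linearly with refresh rate, and a faster pixel clock leaves less margin against cable attenuation and EMI. Here is a minimal back-of-the-envelope sketch; the 1280×1024 mode and the ~30% total blanking overhead are assumptions for illustration, not values from this thread:

```python
# Rough illustration: the VGA pixel clock scales linearly with refresh rate.
# Assumed values (not from the thread): 1280x1024 visible resolution and a
# typical ~30% total blanking overhead in the mode timing.

def approx_pixel_clock_mhz(h_active, v_active, refresh_hz, blanking_overhead=0.30):
    """Estimate the pixel clock in MHz for a CRT mode.

    total pixels per frame ~= active pixels * (1 + blanking_overhead)
    """
    total_pixels = h_active * v_active * (1 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

for hz in (75, 100):
    mhz = approx_pixel_clock_mhz(1280, 1024, hz)
    print(f"{hz} Hz -> ~{mhz:.0f} MHz pixel clock")
```

At the higher refresh rate each pixel period is shorter, so the same high-frequency roll-off and induced noise in the cable blur a larger fraction of every pixel, which is consistent with the image looking worse at 100 Hz than at 75 Hz on the same cable.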