I recently switched from a Radeon 9070 XT to an Nvidia 5070 Ti because of some issues with the former and a good deal I found on the latter. After benchmarking both cards, I noticed that VRAM usage was consistently 1-3 GB higher on the 9070 XT, even at identical settings, and the size of the gap varies from game to game. I'm curious whether this is just how these cards operate, whether my 9070 XT was faulty, or whether there's some other explanation for the difference in VRAM utilization.
5 Answers
Nvidia claims to have developed new compression techniques that reduce VRAM requirements. I think their more advanced compression gives them an edge in managing VRAM, whereas Radeon cards tend to compensate by simply shipping with more of it for the same settings.
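To get a feel for how much compression can matter, here is a rough back-of-the-envelope calculation for a single 4K texture, comparing an uncompressed RGBA8 copy with a generic BC7 block-compressed one. This is only an illustration of block compression in general; it is not a description of the vendor-specific schemes mentioned above, and real games mix many formats and resolutions.

```python
# Rough VRAM footprint of one 4096x4096 texture, uncompressed vs. BC7.
# Illustrative only; actual driver/engine behavior differs per vendor.

WIDTH = HEIGHT = 4096
MIP_FACTOR = 4 / 3  # a full mip chain adds roughly one third on top

rgba8_bytes = WIDTH * HEIGHT * 4  # RGBA8: 4 bytes per texel
bc7_bytes = WIDTH * HEIGHT * 1    # BC7: 16 bytes per 4x4 block = 1 byte per texel

for name, size in [("RGBA8 uncompressed", rgba8_bytes),
                   ("BC7 compressed", bc7_bytes)]:
    with_mips = size * MIP_FACTOR
    print(f"{name}: {size / 2**20:.0f} MiB base, "
          f"~{with_mips / 2**20:.0f} MiB with mipmaps")
```

That works out to roughly 85 MiB versus 21 MiB for one texture with its mip chain, so even small differences in how aggressively each driver compresses or duplicates assets can add up to gigabytes across a whole scene.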
Differences like this between the two vendors have been around for a long time, and honestly only the driver developers could give you all the nitty-gritty reasons. The two cards respond differently to the same games, and differences in per-vendor game optimization explain a lot of it.
VRAM usage isn't a fault indicator for your card at all. In fact, you want as much of your VRAM to be utilized as possible: from the GPU's point of view it is the fastest memory available, and keeping assets resident there instead of streaming them over PCIe from system RAM is what keeps games running smoothly. Just keep in mind that the game and the drivers make their own decisions about what gets loaded and when, so reported usage can change depending on various factors.
Using more VRAM isn't a bad thing. Unused VRAM is effectively wasted. As long as your usage isn't hitting the maximum, it's smart to keep whatever data you can resident in VRAM. Plus, the headroom allows for more mods and tweaks, like high-resolution texture mods in Cyberpunk 2077, for example.
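If you want to check whether you actually have headroom left on the 5070 Ti, a minimal sketch like the one below reads used versus total VRAM from nvidia-smi. It assumes an Nvidia card with the nvidia-smi utility on the PATH; a Radeon card would need a different tool, and in-game overlays (MSI Afterburner, the Adrenalin/Nvidia overlays) report the same numbers.

```python
import subprocess

def vram_usage_mib():
    """Return (used, total) VRAM in MiB for the first Nvidia GPU.

    Assumes nvidia-smi is installed and on the PATH.
    """
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU, e.g. "8123, 16384"; take the first GPU.
    used, total = (int(x) for x in out.strip().splitlines()[0].split(","))
    return used, total

if __name__ == "__main__":
    used, total = vram_usage_mib()
    print(f"VRAM: {used} / {total} MiB ({used / total:.0%} used)")
```

As long as that percentage stays comfortably below 100% during gameplay, higher or lower usage between the two cards is nothing to worry about.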
The difference in VRAM usage is actually pretty normal given the different architectures and drivers Nvidia and Radeon use. Each manages memory allocation, texture caching, and eviction in its own way, so you probably don't need to worry about it too much. It's just how the two handle things differently.
