I'm experiencing major issues while trying to play Red Dead Redemption 2 on my Gigabyte RTX 5080 Gaming OC. Despite running the game at 1440p with optimized settings (not ultra or 4K), I keep getting a warning stating that I'm using too much VRAM. This isn't just a mild inconvenience; the game's performance is so bad that it's practically unplayable. I've updated to the latest NVIDIA drivers and haven't changed any settings that could cause this. Is anyone else with a 5080 or even a 5090 encountering similar problems? Could this be a bug related to VRAM allocation, or is RDR2 just not working well with the 50-series cards?
4 Answers
Try running MSI Afterburner with its overlay during gameplay. It gives you real-time stats, so you can check whether the game is actually maxing out your VRAM or the warning is spurious. You might also find specific fixes on the RDR2 forums, since this sounds more like a game issue than a GPU one.
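If you want hard numbers rather than the in-game warning, you can also log VRAM usage yourself with NVIDIA's NVML bindings for Python (the nvidia-ml-py package, imported as pynvml). This is just a minimal sketch of the same thing Afterburner shows you; run it in a terminal while the game is up.

```python
# Minimal VRAM logger using NVIDIA's NVML bindings.
# Install with: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust the index if you have more than one

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # reports total/used/free in bytes
        print(f"VRAM: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GiB")
        time.sleep(2)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

If the used figure sits well below 16 GiB while the warning still fires, that points at the game's VRAM estimator rather than actual memory pressure.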
It might be a driver bug with the 50-series. My 4070 Ti runs RDR2 smoothly at 4K, and I don't think it uses more than 8GB of VRAM for that game, so 1440p on a 16GB card shouldn't come anywhere near the limit. It's worth checking the release notes for your current driver for known issues as well.
Just a thought, but is your monitor plugged into the motherboard instead of the graphics card? That would route rendering through the integrated GPU and could definitely cause problems like this. Another user has also reported this as a known problem with RDR2 on the 50-series GPUs.
Have you looked into your BIOS settings? You might want to disable the onboard graphics there, as RDR2 sometimes picks it instead of your dedicated GPU, which could be the root of your problem.
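If you'd rather verify before changing BIOS settings, a rough way to check is to ask NVML which processes are actually running on the NVIDIA card. Same nvidia-ml-py package as above, and this is an assumption-laden sketch: on some Windows drivers the per-process memory figure comes back unavailable, which the code just treats as zero.

```python
# List processes the NVIDIA GPU currently sees (nvidia-ml-py package).
# If RDR2 doesn't show up here while the game is running, it isn't
# rendering on the dedicated card.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    try:
        name = pynvml.nvmlSystemGetProcessName(proc.pid).decode()
    except pynvml.NVMLError:
        name = "<unknown>"
    # usedGpuMemory can be None when the driver doesn't report it
    mem_mib = (proc.usedGpuMemory or 0) / 1024**2
    print(f"pid {proc.pid}: {name} ({mem_mib:.0f} MiB)")

pynvml.nvmlShutdown()
```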
Nope, it's plugged into the graphics card. I've been gaming without any issues on other titles.