What’s the Best Used GPU with Plenty of VRAM for Local LLMs?

Asked By TechSavvy42

I'm on the hunt for a budget-friendly used GPU with a good amount of VRAM for running local LLMs. I found an RTX 2080 Ti with 11GB of VRAM going for about 200 euros on eBay. However, I'm wondering if there are other options out there with more VRAM that won't break the bank, ideally staying around 300 euros. Any suggestions?

1 Answer

Answered By GigaGamer99

You might want to check out the 16GB variant of the RTX 4060 Ti. It could be a solid choice for your needs!
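
A quick way to sanity-check whether a given model fits on a card is the usual rule of thumb: weight memory is roughly parameter count times bits per weight divided by 8, plus some headroom for the KV cache and runtime buffers. Here's a minimal sketch of that estimate; the 4-bit quantization, the ~20% overhead factor, and the helper name are assumptions for illustration, not exact figures:

```python
# Rough VRAM estimate for running a quantized LLM locally.
# Assumptions (ballpark, not exact): weights take about
# params * bits_per_weight / 8 bytes, plus ~20% overhead for
# the KV cache, activations, and framework buffers.

def estimate_vram_gb(params_billions: float,
                     bits_per_weight: int = 4,
                     overhead: float = 1.2) -> float:
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# Compare common model sizes against the two cards in this thread.
for params in (7, 13, 34):
    need = estimate_vram_gb(params)
    for card, vram in (("RTX 2080 Ti", 11), ("RTX 4060 Ti 16GB", 16)):
        verdict = "fits" if need <= vram else "does not fit"
        print(f"{params}B @ 4-bit needs ~{need:.1f} GB -> "
              f"{verdict} on {card} ({vram} GB)")
```

By that estimate, 7B and 13B models at 4-bit fit comfortably on either card, while the extra 5GB on the 16GB card mainly buys you longer context windows or less aggressive quantization.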
