I'm on the hunt for a budget-friendly used GPU with a good amount of VRAM for running local LLMs. I found an RTX 2080 Ti with 11GB of VRAM going for about 200 euros on eBay. However, I'm wondering if there are other options with more VRAM that won't break the bank, ideally staying around 300 euros. Any suggestions?
1 Answer
You might want to check out the RTX 4060 Ti, which comes with 16GB VRAM. It could be a solid choice for your needs!
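To sanity-check whether a model fits in a given card's VRAM, a common back-of-the-envelope rule is parameter count times bytes per parameter, plus some headroom for the KV cache and activations. The sketch below uses an assumed 20% overhead factor and treats 4-bit quantization as 0.5 bytes per parameter; real usage varies with context length and runtime, so treat the numbers as rough estimates only:

```python
def estimate_vram_gb(params_billions, bytes_per_param, overhead=1.2):
    """Rough VRAM estimate in GB: weight size scaled by an assumed
    overhead factor for KV cache and activations (not exact)."""
    return params_billions * bytes_per_param * overhead

# A 13B model at 4-bit quantization (~0.5 bytes/param):
print(round(estimate_vram_gb(13, 0.5), 1))  # ~7.8 GB, fits in 11GB or 16GB

# The same model at 8-bit (~1 byte/param):
print(round(estimate_vram_gb(13, 1.0), 1))  # ~15.6 GB, tight even on 16GB
```

By this rough measure, 16GB over 11GB mainly buys you room for 8-bit quantization or longer context windows on mid-size models.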