I'm curious if anyone here has experience building a system with dual RTX 5060 Ti graphics cards, specifically the 16GB versions. I'm looking to train some models and wondering what kind of performance I can expect. How large of a model can I realistically work with on this setup?
3 Answers
Honestly, I'd steer clear of dual 5060 Tis. Current consumer GPUs don't have NVLink, so the two cards can't pool their memory: under data parallelism each GPU has to hold a full copy of the model, which effectively caps you at 16GB per replica, and synchronizing gradients over PCIe adds latency when training across two cards. A single card with a wider memory bus, like a 5080 or even a used 5090, could give you better performance without the hassle of managing dual GPUs.
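To see why the per-card 16GB cap bites during training, here's a rough back-of-envelope estimate. It assumes fp32 weights and a plain Adam optimizer (two extra state tensors per parameter); activations are batch-dependent and come on top, so treat this as a floor, not an exact number:

```python
def training_vram_gb(n_params, bytes_per_param=4, optimizer_states=2):
    """Rough VRAM floor for one full model replica under data parallelism:
    weights + gradients + optimizer states (Adam keeps 2 per parameter).
    Activations are extra and depend on batch size."""
    tensors = 1 + 1 + optimizer_states  # weights, grads, Adam m and v
    return n_params * bytes_per_param * tensors / 1024**3

# A 1B-parameter model in fp32 with Adam needs roughly 15 GB before
# activations, so it barely squeezes onto one 16 GB card per replica.
print(round(training_vram_gb(1e9), 1))  # -> 14.9
```

Mixed precision and memory-efficient optimizers lower that floor, but the point stands: without pooled memory, every replica must fit in one card's 16GB on its own.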
It really depends on the type of deep learning tasks you're planning to tackle. For image classification or segmentation, dual RTX 5060 Tis should work fine with standard data parallelism. However, if you're diving into large language models (LLMs) or image generation, you'll likely hit memory limits, since those models can exceed what a single 16GB card holds. Running dual GPUs is workable, but splitting one model across two consumer cards takes extra effort and isn't that common.
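The gap between those two workloads is easy to quantify. As an illustration, compare a typical image classifier (ResNet-50, roughly 25.6M parameters) against a hypothetical 7B-parameter LLM, again assuming fp32 training with Adam:

```python
def fp32_training_gb(n_params):
    # weights + gradients + Adam's two moment buffers, 4 bytes each
    # (activations excluded, so this is a lower bound)
    return n_params * 4 * 4 / 1024**3

# Vision model (ResNet-50, ~25.6M params): fits easily on a 16 GB card.
print(round(fp32_training_gb(25.6e6), 2))  # -> 0.38
# 7B LLM: far beyond a single 16 GB card even before activations.
print(round(fp32_training_gb(7e9), 1))     # -> 104.3
```

That's why vision tasks are comfortable on this setup while full LLM training isn't, regardless of how many 16GB cards you add under plain data parallelism.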
If you've got the 16GB version, you're in decent shape and can run or fine-tune smaller models, but don't expect too much if you're using the 8GB version. Just keep in mind the limits on what you can accomplish with those specs.
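For inference specifically, quantization changes the picture a lot. A quick sketch of weight-only memory at 4-bit precision, using 7B and 13B as illustrative model sizes (KV cache and activations still add overhead on top):

```python
def quantized_weights_gb(n_params, bits=4):
    """Weight memory only, at the given quantization width.
    KV cache and activations add more on top of this."""
    return n_params * bits / 8 / 1024**3

for n in (7e9, 13e9):
    print(round(quantized_weights_gb(n), 2))
# 7B at 4-bit:  ~3.26 GB of weights
# 13B at 4-bit: ~6.05 GB of weights
```

So a 16GB card leaves real headroom for 4-bit models in that range, while an 8GB card gets cramped quickly once cache and context length grow.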
