Can I Use Two Different 3060 Ti GPUs for LLM Work?

Asked By TechieNinja29

I'm wondering if I can pair my 8GB 3060 Ti with a 12GB 3060 Ti for running LLMs. Will they work together effectively? What should I keep in mind?

3 Answers

Answered By RenderMasterX

You can definitely use both GPUs for different workloads, such as CUDA rendering or machine-learning jobs. Keep in mind that games will only run on one GPU at a time, either the 8GB or the 12GB card, so if gaming is a priority, upgrading to a single stronger card is the better move.

TechieNinja29 -

Got it! I was mainly thinking of multitasking with different applications.

Answered By GamerGuru88

Using two different GPUs won't help with gaming performance, especially since SLI is pretty much a thing of the past. Whether it helps your setup depends on the workload: with both cards installed, they can handle separate tasks, but a game will still only use one GPU. If gaming is your main goal, a single, more powerful card is the better buy; if it's LLM work or rendering, two cards can be worthwhile.

PixelProwler42 -

Exactly! Plus, these days most modern setups don't need SLI anyway, unless you're doing something super specific.

Answered By OldschoolRig

Not the best idea for gaming, since SLI is outdated. For LLM or other AI work, though, two cards can genuinely help: frameworks that shard a model across GPUs can use both, even with mismatched VRAM, giving you a larger combined memory pool. Just check first whether the specific software and models you plan to run actually support multi-GPU splitting. If they don't, saving up for one high-end card is usually the better investment, and it's the way to go for gaming setups nowadays anyway.
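The "split across both GPUs" idea above can be sketched in plain Python. This is a simplified illustration, not any particular library's code: real multi-GPU loaders (for example llama.cpp's tensor-split option, or Hugging Face's `device_map="auto"`) do something similar but also budget for the KV cache and activations, not just the weights.

```python
def split_layers_by_vram(n_layers, vram_gb):
    """Assign model layers to GPUs roughly proportionally to their VRAM.

    Simplified sketch: the bigger card gets the bigger share of layers.
    vram_gb is a list of per-GPU memory sizes in gigabytes.
    """
    total = sum(vram_gb)
    assignment = []
    assigned = 0
    for i, gb in enumerate(vram_gb):
        if i == len(vram_gb) - 1:
            # Last GPU takes whatever remains, so every layer is placed.
            count = n_layers - assigned
        else:
            count = round(n_layers * gb / total)
        assignment.append(count)
        assigned += count
    return assignment

# A 40-layer model across an 8 GB and a 12 GB card:
print(split_layers_by_vram(40, [8, 12]))  # -> [16, 24]
```

With an 8GB + 12GB pair you'd expect roughly a 40/60 split of the layers, which is why mismatched cards still work, just unevenly.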

ByteBender11 -

Totally agree! A single powerful GPU is generally a better investment.
