I'm planning to build a new PC for some AI projects and want a setup that can accommodate up to 3 GPUs in the future. Currently, I'm searching for a compatible tower and motherboard that can fit and adequately cool 3 GPUs along with space for 3-5 SSDs. Also, I'm curious about water-cooling options for GPUs, as it's been a while since I built a computer. Are there any benefits to going with water cooling, and can it work well in a compact tower?
5 Answers
It's great that you're planning for expansion! You'll need to consider the thickness of the GPUs and how many PCIe lanes you'll require. If you're looking at high-end cards, you might need to move into the HEDT (High-End Desktop) territory, which typically requires a CPU and motherboard designed for more lanes and slots, like a low-end Xeon W or Threadripper. Liquid-cooled GPUs can be an excellent choice since they free up space by eliminating the bulky air coolers and can run cooler since they're not competing for airflow. The Thermaltake Core W200 is a solid option if you want plenty of room for radiators and cooling components while also accommodating large GPU setups.
At current prices, workstation GPUs are comparable to consumer ones. You might want to consider a single PRO 6000 card instead of three 5090s; it could save you quite a bit!
Definitely think this through, since it can get quite pricey! To run three GPUs effectively, aim for a "prosumer"-sized motherboard like CEB or EEB, especially if you're considering an AMD Threadripper or Intel Xeon. Cases like the Fractal Define 7 XL are great since they support those larger motherboards. And yeah, given the tight fit, you might need to water-cool at least two of your GPUs to keep temperatures in check.
Fractal Design Define XL is a solid choice for your build! Just ensure that you account for the motherboard layout and how many slots it uses, as that will affect GPU spacing.
I actually ran three 3090s in an EVO XL Dynamic case, with a vertical mount for the third GPU. It got a bit toasty under load, but it handled continuous use across several projects just fine, with each GPU pulling around 250-275 W.

Thanks! I appreciate the heads-up on PCIe lanes. From what I understand, additional AI GPUs can get by on a single lane, but for optimal performance I might want 16 lanes for video and 16 for my main AI GPU. So I might end up with only two GPUs, or have to set up a separate AI rig.
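One way to reason about this before buying anything is to total up the lanes each device would request against what the CPU exposes. Here's a minimal sketch of that arithmetic; the lane counts below are illustrative assumptions, not real specs for any particular CPU or board, so check your platform's manual and block diagram for the actual numbers:

```python
def fits_lane_budget(cpu_lanes, devices):
    """Return (fits, total_requested) for a dict of device -> lanes requested."""
    total = sum(devices.values())
    return total <= cpu_lanes, total

# Hypothetical mainstream desktop CPU with ~24 usable lanes (assumption).
devices = {
    "GPU 0 (display + AI)": 16,
    "GPU 1 (AI, reduced link)": 4,
    "GPU 2 (AI, reduced link)": 4,
    "NVMe SSD": 4,
}
fits, total = fits_lane_budget(24, devices)
print(total, fits)  # 28 lanes requested -> doesn't fit in 24
```

This is why the HEDT suggestion keeps coming up: a Threadripper- or Xeon W-class platform with far more CPU lanes lets all three GPUs run at wider links instead of being bifurcated down or hung off the chipset.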