Would anyone be interested in buying pre-built and pre-configured 4x3090/4090 or 8x3090/4090 PCs for training/inference?
Since I have been researching and building multi-GPU PCs a lot, I was wondering whether people here would also be interested in buying the same hardware we use.
We use RTX 3090/4090 GPUs since they offer the best VRAM/$ and perf/$ of any card, but their triple- and quad-slot coolers are difficult to work with, so we had to get creative to fit 4x GPUs in a 4U server chassis and 8x GPUs in a 6U chassis.
We are using X99 and C612 LGA2011-3 based systems to keep costs low, and because the platform didn't affect performance much in our experience, although substituting Threadripper or a newer server CPU is possible if needed.
We can also provide an Ubuntu 22.04 install that comes pre-configured with the correct drivers, CUDA, and whatever open-source training or inference software you want, to cut down on setup headaches.
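For anyone wondering what "pre-configured" would mean in practice, here is a minimal sanity-check sketch (assuming PyTorch is part of the requested software stack, and the GPU count of 4 is just an example) that we could run on a box before shipping to confirm the driver, CUDA, and all GPUs are visible:

```python
# Quick sanity check for a pre-configured box: confirms the NVIDIA driver,
# CUDA runtime, and all GPUs are visible to PyTorch.
import torch

def check_gpus(expected_count: int = 4) -> None:
    assert torch.cuda.is_available(), "CUDA not available - check driver/CUDA install"
    found = torch.cuda.device_count()
    assert found == expected_count, f"Expected {expected_count} GPUs, found {found}"
    for i in range(found):
        props = torch.cuda.get_device_properties(i)
        vram_gb = props.total_memory / 1024**3
        print(f"GPU {i}: {props.name}, {vram_gb:.1f} GB VRAM")

if __name__ == "__main__":
    check_gpus(expected_count=4)  # use 8 for the 8x builds
```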
Considering that pre-built AI/machine-learning PCs usually cost insane amounts (see the examples below), I thought this might be something some people here would be interested in. We should be able to offer a 4x3090 machine for under $5K with our custom case and PCIe cabling solution.
The other problem I see with these prebuilt AI PCs is that the builders offer odd GPU counts like 6x or 7x, which are practically useless for inference or training with tensor parallelism (which you should be using on machines this powerful), since the GPU count needs to be a power of 2 (a quick sketch of why is below the links).
tinygrad: A simple and powerful neural network framework
AI and Deep Learning NVIDIA Workstations | Boxx Technologies
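Here is a rough illustration of the "power of 2" point above. It is not any specific framework's code; the head count and hidden size are the published Llama-2-70B values, used only as an example. Tensor parallelism splits the attention heads and hidden dimension across GPUs, so the GPU count has to divide them evenly, which 6 and 7 don't for typical models:

```python
# Why tensor parallelism wants GPU counts like 2/4/8: the attention heads
# and hidden dimension must split evenly across GPUs.
NUM_HEADS = 64      # Llama-2-70B attention heads (example model)
HIDDEN_SIZE = 8192  # Llama-2-70B hidden dimension

def tensor_parallel_ok(num_gpus: int) -> bool:
    """A TP degree only works if it divides the heads and hidden dim evenly."""
    return NUM_HEADS % num_gpus == 0 and HIDDEN_SIZE % num_gpus == 0

for gpus in (2, 4, 6, 7, 8):
    status = "works" if tensor_parallel_ok(gpus) else "does not split evenly"
    print(f"{gpus}x GPUs: {status}")
# -> 2x, 4x, 8x work; 6x and 7x leave heads/dimensions that can't be split evenly.
```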