Submitted by zveroboy152 t3_zwtgqw in MachineLearning
Tom_Neverwinter t1_j1x5gyp wrote
Bought a few Tesla M40s; now I need a motherboard with enough GPU slots. 1x slots have too much of a bottleneck, whoops.
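The 1x bottleneck can be put in rough numbers. A back-of-envelope sketch (not from the thread, just illustrative arithmetic using PCIe 3.0's nominal per-lane rate):

```python
# PCIe 3.0 moves roughly 0.985 GB/s per lane in each direction
# after 128b/130b encoding overhead.
PCIE3_GBPS_PER_LANE = 0.985

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Approximate one-direction PCIe 3.0 bandwidth for a given lane count."""
    return lanes * PCIE3_GBPS_PER_LANE

x1 = pcie3_bandwidth_gbps(1)    # ~1 GB/s on a 1x slot
x16 = pcie3_bandwidth_gbps(16)  # ~15.8 GB/s on a full 16x slot
print(f"x1: {x1:.2f} GB/s, x16: {x16:.2f} GB/s, ratio: {x16 / x1:.0f}x")
```

So a 1x riser gives you roughly 1/16th of the host-to-GPU transfer rate of a proper 16x slot, which is why a board with enough full-width slots matters for multi-GPU builds.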
zveroboy152 OP t1_j1xi8km wrote
Hi Tom,
I ran into that problem too. I ended up getting one of these for my Teslas and AMD MI25s:
SUPERMICRO 1027GR-TRF
(not sponsored)
https://www.theserverstore.com/Supermicro-SuperServer-1027GR-TRFT-1U-GPU-Server
It's been great for my GPU workloads.
ggf31416 t1_j21s4dy wrote
Not a great choice unless you really need the cheapest way to get to 24GB
https://www.reddit.com/r/MLQuestions/comments/rttzxg/tesla_m40_24gb_gpu_very_poor_machinelearning/
Tom_Neverwinter t1_j21sqit wrote
Need lots of VRAM for AI models.
Performance seems related to how you have it set up.
Proxmox and Xen seem to work well after some initial setup.
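The "lots of VRAM" point is easy to quantify with a rough weights-only estimate (my own illustrative sketch, not from the thread; it ignores activations, optimizer state, and KV caches):

```python
# Rough VRAM needed just to hold model weights.
def weight_vram_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Weights-only VRAM in GiB (fp16/bf16 = 2 bytes/param, fp32 = 4)."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

# A 13B-parameter model in fp16 barely fits in an M40's 24 GB:
print(f"{weight_vram_gb(13):.1f} GB")  # ~24.2 GB for weights alone
```

That is why a cheap 24 GB card like the M40 is attractive despite its age: capacity, not speed, is often the first wall you hit.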