Submitted by zveroboy152 t3_zwtgqw in MachineLearning
ggf31416 t1_j21s4dy wrote
Reply to comment by Tom_Neverwinter in [R] PyTorch | Budget GPU Benchmarking by zveroboy152
The M40 is not a great choice unless you really need the cheapest way to get to 24 GB of VRAM:
https://www.reddit.com/r/MLQuestions/comments/rttzxg/tesla_m40_24gb_gpu_very_poor_machinelearning/
Tom_Neverwinter t1_j21sqit wrote
You need lots of VRAM for AI models.

Performance seems related to how you have it set up. Proxmox and Xen seem to work well after some initial setup.
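To see why 24 GB matters, here is a rough back-of-the-envelope VRAM estimate (a sketch, not a benchmark): weights only, ignoring activations, KV cache, and framework overhead. The parameter counts below are illustrative assumptions.

```python
# Rough VRAM needed just to hold model weights, ignoring activations,
# optimizer state, and framework overhead (illustrative estimate only).
def weights_vram_gb(n_params: float, bytes_per_param: int) -> float:
    """Gigabytes of memory for n_params weights at the given precision."""
    return n_params * bytes_per_param / 1024**3

# Example: a hypothetical 13B-parameter model at different precisions.
fp32 = weights_vram_gb(13e9, 4)  # ~48 GB: does not fit in 24 GB
fp16 = weights_vram_gb(13e9, 2)  # ~24 GB: borderline on an M40 24GB
int8 = weights_vram_gb(13e9, 1)  # ~12 GB: fits with room for activations
```

So a card like the M40 mostly buys you capacity to load larger models at reduced precision, not speed.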