Submitted by [deleted] t3_zrx665 in MachineLearning
softclone t1_j16f79b wrote
Depends. https://vast.ai/ is great but has certain limitations. If you can run on 1-4 24GB RTX 3090 cards, rolling your own is going to be the best value. 4090s are of course good too, but you'd need to find a good deal to make them worth it vs the 3090s. You can always start with 1 and go from there. Otherwise you'll be paying 10X more for some A100s. First step is to get a really good handle on how much compute you're actually using and what the smallest GPU/VRAM size is that works efficiently for your data.
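To gauge whether a model fits on a 24GB card, a common back-of-envelope rule is ~16 bytes per parameter for mixed-precision Adam training (fp16 weights and gradients, plus fp32 master weights and optimizer moments), before activations. A minimal sketch under that assumption (the helper name is hypothetical, and real usage varies with batch size, activations, and framework overhead):

```python
# Back-of-envelope VRAM estimate for training. Assumption: mixed-precision
# Adam costs roughly 16 bytes per parameter (fp16 weights + gradients,
# fp32 master weights + two optimizer moments), ignoring activations.
def estimate_train_vram_gb(n_params: float, bytes_per_param: float = 16.0) -> float:
    """Return a rough lower bound on training VRAM in GB."""
    return n_params * bytes_per_param / 1e9

# A 1.3B-parameter model needs at least ~20.8 GB before activations,
# i.e. it just squeezes onto a single 24GB RTX 3090.
print(estimate_train_vram_gb(1.3e9))  # → 20.8
```

Inference-only in fp16 is closer to 2 bytes per parameter, which is why much larger models fit on the same card when you're not training.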