Submitted by Shardsmp t3_zil35t in MachineLearning
HateRedditCantQuitit t1_iztdm8c wrote
> but I'm still wondering whether its worth it to just build a graphics card rig for the long term.
Pretty much never, assuming it's for personal use.
If you're going to use this rig exclusively for ML, then maybe it still makes sense. The calculation becomes simple: cost to buy + energy cost to run × amount of use before it doesn't fit your needs, vs. cloud cost. If you use it enough for this to make sense, you might also be surprised by how quickly you outgrow it (e.g. maybe you'll want to run some experiments in parallel sometimes, or in a year or few you'll want to use models bigger than this thing's VRAM).
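That break-even calculation can be sketched in a few lines. All the numbers below (rig price, power draw, electricity rate, cloud hourly rate) are hypothetical placeholders, not real prices:

```python
def breakeven_hours(rig_cost, power_kw, energy_price_per_kwh, cloud_rate_per_hour):
    """Hours of GPU use at which owning the rig becomes cheaper than renting.

    Owning:  rig_cost + power_kw * energy_price_per_kwh * hours
    Renting: cloud_rate_per_hour * hours
    Break-even when rig_cost == (cloud_rate - energy_cost_per_hour) * hours.
    """
    energy_per_hour = power_kw * energy_price_per_kwh
    saving_per_hour = cloud_rate_per_hour - energy_per_hour
    if saving_per_hour <= 0:
        return float("inf")  # cloud is always cheaper at these rates
    return rig_cost / saving_per_hour

# Hypothetical example: a $1,600 rig drawing 0.5 kW at $0.15/kWh,
# vs. a cloud GPU instance at $1.10/hr
hours = breakeven_hours(1600, 0.5, 0.15, 1.10)
print(round(hours))  # → 1561 hours of use before buying pays off
```

If you won't realistically log that many GPU-hours before the card no longer fits your needs, the cloud wins, which is the point the comment is making.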
If you want to use it for non-ML purposes too, then no, just use the cloud. If you're using it enough that the above calculation says to buy, then you won't actually get to use it for anything else, which will just annoy the hell out of you.
Shardsmp OP t1_izy4io0 wrote
haha ty