Submitted by Shardsmp t3_zil35t in MachineLearning
Fellow machine learning enthusiast here!
I want to train a large NLP model and I'm wondering whether it's worth it to use Google Cloud's TPUs for it. I already have an Nvidia RTX 3060 Laptop GPU with 8.76 TFLOPS, but I was unable to find the exact performance (in TFLOPS, so I can compare them) of Google's TPU v3 and v4.
I know TPUs are much faster and more optimized for machine learning than GPUs (I think the factor is around 12x), but I'm still wondering whether it's worth it to just build a graphics card rig for the long term, since the pricing and estimation seem unclear to me — I can't see how much I'm paying per TFLOP.
Has anyone done the numbers on price/performance and hourly cost? Also is there any factor I missed? Thanks a lot in advance!
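For anyone wanting to run the numbers themselves, here's a minimal sketch of the cost-per-TFLOP-hour comparison the question asks about. All the hourly prices and TFLOPS figures below are illustrative placeholders, not current quotes — plug in real numbers from Google Cloud's TPU pricing page and your own hardware specs. Also note that GPU FP32 TFLOPS and TPU bf16 TFLOPS aren't directly comparable, so treat this as a rough first pass:

```python
# Rough price/performance comparison: dollars per TFLOP-hour.
# All numbers are ILLUSTRATIVE PLACEHOLDERS, not real quotes.
# Caveat: FP32 (GPU) vs bf16 (TPU) TFLOPS are not apples-to-apples.

def cost_per_tflop_hour(hourly_cost_usd: float, tflops: float) -> float:
    """Dollars paid per TFLOP of peak compute per hour."""
    return hourly_cost_usd / tflops

options = {
    # name: (assumed hourly cost in USD, assumed peak TFLOPS)
    "Owned RTX 3060 Laptop (electricity only)": (0.02, 8.76),
    "Hypothetical cloud TPU slice":             (8.00, 420.0),
}

for name, (cost, tflops) in options.items():
    rate = cost_per_tflop_hour(cost, tflops)
    print(f"{name}: ${rate:.4f} per TFLOP-hour")
```

For a real decision you'd also want to fold in the amortized purchase price of the GPU rig (e.g. hardware cost divided by expected hours of use) rather than electricity alone.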
VirtualHat t1_izrki2g wrote
I would also like to know the answer to this...