thesupernoodle t1_jcsll6u wrote
Reply to comment by FirstOrderCat in Best GPUs for pretraining roBERTa-size LLMs with a $50K budget, 4x RTX A6000 v.s. 4x A6000 ADA v.s. 2x A100 80GB by AngrEvv
Sure; but the broader point is that they can optimize for their needs with some cheap testing: is the model big enough that it actually wants the extra RAM of an 80 GB A100?
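A minimal sketch of that kind of cheap test, assuming PyTorch and a Hugging Face RoBERTa; the config, batch shape, and optimizer here are placeholders you'd swap for your actual training setup. Run one training step on whatever GPU you have and read the peak memory:

```python
import torch
from transformers import RobertaConfig, RobertaForMaskedLM

# Placeholder config -- roberta-base-sized by default; swap in your target size.
config = RobertaConfig()
model = RobertaForMaskedLM(config).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Placeholder batch: batch size 16, sequence length 512.
batch = torch.randint(0, config.vocab_size, (16, 512), device="cuda")

torch.cuda.reset_peak_memory_stats()
loss = model(input_ids=batch, labels=batch).loss
loss.backward()
optimizer.step()

peak_gb = torch.cuda.max_memory_allocated() / 1024**3
print(f"Peak memory for one step: {peak_gb:.1f} GB")
```

If the peak at your target batch size comfortably fits in 48 GB, the A6000s are probably fine; if you only fit by shrinking the batch or adding gradient accumulation, that's the case for the 80 GB A100.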