[D] ChatGPT and AI ethics
Submitted by [deleted] (t3_11nenyo) on March 10, 2023 at 4:29 AM in MachineLearning · 11 comments
WH7EVR t1_jbngk56 wrote on March 10, 2023 at 8:19 AM It took about 120 GPU-years (A100 80GB) to train LLaMA. If you want to train it from scratch, it'll cost you a ton of money and/or time. That said, you can fine-tune LLaMA as-is. There's no real point in recreating it.
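To put "120 GPU-years" in perspective, here's a rough back-of-the-envelope cost sketch. The hourly rate is a hypothetical placeholder, not a quote from any provider; actual cloud A100 pricing varies widely.

```python
# Rough cost estimate for ~120 A100-GPU-years of training.
# The $/GPU-hour figure below is an assumed placeholder, not real pricing.

HOURS_PER_YEAR = 365 * 24          # 8760 hours
gpu_years = 120                     # figure cited in the comment
assumed_rate_usd_per_gpu_hour = 1.50  # hypothetical cloud rate

gpu_hours = gpu_years * HOURS_PER_YEAR
estimated_cost = gpu_hours * assumed_rate_usd_per_gpu_hour

print(f"{gpu_hours:,} GPU-hours")            # 1,051,200 GPU-hours
print(f"~${estimated_cost:,.0f} at the assumed rate")
```

Even under optimistic pricing assumptions, this lands in the seven-figure range, which is why fine-tuning existing weights is the practical path.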