[Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset
Submitted by imgonnarelph (t3_11wqmga) on March 20, 2023 at 6:17 PM in MachineLearning · 292 points · 80 comments
2muchnet42day (t1_jd3pu0m) wrote on March 21, 2023 at 4:50 PM, replying to benfavre:
Can you train with 24 GB of VRAM?
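For context on the question: a full fine-tune of a 30B-parameter model will not fit in 24 GB of VRAM, so the usual workaround is parameter-efficient fine-tuning (LoRA) on top of a quantized base model. The sketch below is hypothetical and not from this thread; it assumes the Hugging Face transformers, peft, and bitsandbytes libraries, and the checkpoint name, LoRA rank, and quantization settings are illustrative assumptions. Whether it actually fits in 24 GB depends on sequence length, batch size, and gradient checkpointing.

```python
# Hypothetical sketch: LoRA fine-tuning of a quantized LLaMA checkpoint.
# Assumes transformers + peft + bitsandbytes; model path is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_name = "path/to/llama-30b-hf"  # assumed local/hub path to converted weights

# Load the frozen base model with 4-bit quantized weights to reduce VRAM use.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Train only small low-rank adapter matrices on the attention projections;
# the 30B base weights stay frozen.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a tiny fraction of parameters train
```

With this setup, the trainable parameters and optimizer states are limited to the adapters, which is what makes single-GPU fine-tuning of a model this size plausible at all; the dominant memory cost is the quantized base weights plus activations.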