Seankala OP t1_j9np9ae wrote
Reply to comment by chogall in [D] 14.5M-15M is the smallest number of parameters I could find for current pretrained language models. Are there any that are smaller? by Seankala
I'd guess at least 100M+ parameters. I like to think of the BERT-base model (roughly 110M parameters) as being the "starting point" of LLMs.
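A quick way to sanity-check that figure is to count the parameters directly. This is just a minimal sketch using the Hugging Face transformers library, assuming `bert-base-uncased` is the checkpoint being referred to:

```python
# Minimal sketch: count the parameters of BERT-base with Hugging Face transformers
# (assumes `pip install transformers torch`).
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # ~110M for BERT-base
```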
FluffyVista t1_j9ottk1 wrote
probably
Yahentamitsi t1_j9xi4q4 wrote
That's a good question! I'm not sure if there are any pretrained language models with fewer parameters, but you could always try training your own model from scratch and see how small you can get it.
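If you do go the from-scratch route, shrinking the config is the easy part. Below is a hedged sketch of a tiny BERT-style masked LM built with the Hugging Face transformers API; the vocab size, hidden size, and layer counts are purely illustrative, not values anyone in the thread suggested:

```python
# Illustrative sketch of a tiny BERT configured from scratch (untrained).
# All hyperparameters here are made up for the example.
from transformers import BertConfig, BertForMaskedLM

config = BertConfig(
    vocab_size=8000,              # small tokenizer vocabulary
    hidden_size=128,              # much narrower than BERT-base's 768
    num_hidden_layers=2,          # vs. 12 in BERT-base
    num_attention_heads=2,
    intermediate_size=512,
    max_position_embeddings=128,  # shorter max sequence length
)
model = BertForMaskedLM(config)
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.2f}M parameters")
```

A config like this lands in the low single-digit millions of parameters, so how small you can usefully go ends up being a question of task difficulty and training data rather than architecture.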