keeplosingmypws t1_jdtu810 wrote
Reply to AI being run locally got me thinking, if an event happened that would knock out the internet, we'd still have the internet's wealth of knowledge in our access. by Anjz
Thought the same thing. Compressed, imperfect backups of the sum total of human knowledge.
Each local LLM could kickstart civilizations and technological progress in case of catastrophe. Basically global seed vaults but for knowledge and information.
keeplosingmypws t1_jd9wpwm wrote
Reply to comment by KerfuffleV2 in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
Thanks for leading me in the right direction! I’ll letcha know if I get it working
keeplosingmypws t1_jd5xygm wrote
Reply to comment by KerfuffleV2 in [Project] Alpaca-30B: Facebook's 30b parameter LLaMa fine-tuned on the Alpaca dataset by imgonnarelph
I have the 16B parameter version of Alpaca.cpp (plus a copy of the training data and the weights) installed locally on a machine with an Nvidia 3070 GPU. I know I can launch my terminal using the Discrete Graphics Card option, but I also believe this version was built for CPU use, so I'm guessing I'm not getting the most out of my graphics card.

What's the move here?
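(Editor's note: a minimal sketch of one possible answer, not from the thread. alpaca.cpp itself runs on the CPU; its actively maintained successor, llama.cpp, can offload transformer layers to an Nvidia GPU when built with CUDA support. The model filename below is a placeholder, and the exact layer count should be tuned to the 3070's 8 GB of VRAM.)

```shell
# Build llama.cpp with CUDA support (requires the CUDA toolkit).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_CUDA=ON
cmake --build build --config Release

# -ngl offloads that many layers to the GPU; lower it if you run
# out of VRAM. The model path here is a placeholder.
./build/bin/llama-cli -m ./models/model-q4_0.gguf -ngl 32 -p "Hello"
```

With quantized weights, even partial GPU offload typically speeds up token generation substantially compared with CPU-only inference.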
keeplosingmypws t1_jdvkq3a wrote
Reply to comment by GoldenRain in AI being run locally got me thinking, if an event happened that would knock out the internet, we'd still have the internet's wealth of knowledge in our access. by Anjz
Oh, great call