Submitted by rustymonster2000 t3_11w8lp2 in MachineLearning
tungns91 t1_jcxkd5z wrote
Reply to comment by Civil_Collection7267 in [D] Best ChatBot that can be run locally? by rustymonster2000
Do you have a specific chart comparing consumer hardware against the performance of LLaMA 7B through 65B? I'd like to know whether my modest gaming PC could produce a response in under a minute.
Civil_Collection7267 t1_jczrmem wrote
Tom's Hardware has an article on that: https://www.tomshardware.com/news/running-your-own-chatbot-on-a-single-gpu
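As a rough back-of-envelope check (not a substitute for the benchmarks in the linked article), response time is just reply length divided by decode throughput. The tokens-per-second figures below are illustrative assumptions, not measurements of any particular GPU:

```python
# Back-of-envelope: can a given setup produce a ~100-token reply in under a minute?
# All throughput numbers below are assumptions for illustration, not measurements.

def response_time_s(reply_tokens: int, tokens_per_second: float) -> float:
    """Seconds to generate a reply at a given decode throughput."""
    return reply_tokens / tokens_per_second

# Hypothetical decode speeds for quantized models on consumer hardware:
assumed_speeds = {
    "7B, consumer GPU": 10.0,   # tokens/s (assumption)
    "13B, consumer GPU": 5.0,   # tokens/s (assumption)
    "65B, CPU offload": 1.0,    # tokens/s (assumption)
}

for setup, tps in assumed_speeds.items():
    t = response_time_s(100, tps)
    verdict = "under a minute" if t < 60 else "too slow"
    print(f"{setup}: ~{t:.0f}s for a 100-token reply ({verdict})")
```

The takeaway: anything decoding faster than about 2 tokens/s clears a one-minute budget for a typical chat reply, so the real question is which quantization level your VRAM can hold.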