Submitted by imgonnarelph t3_11yof4h in MachineLearning
👋 Hey all, we just launched ChatLLaMA, an experimental chatbot interface for interacting with variants of Facebook's LLaMA. Currently, we support the 7-billion-parameter variant that was fine-tuned on the Alpaca dataset. This early version isn't as conversational as we'd like, but over the next week or so we're planning on adding support for the 30-billion-parameter variant, another variant fine-tuned on LAION's OpenAssistant dataset, and more as we explore what this model is capable of.
If you want to deploy your own instance of the model powering the chatbot and build something similar, we've open-sourced the Truss here: https://github.com/basetenlabs/alpaca-7b-truss
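Once a Truss is deployed and serving, you can talk to it over plain HTTP. Below is a minimal sketch of a client; the endpoint URL and the `prompt`/`completion` field names are assumptions here and depend on how your particular deployment is configured.

```python
import json
import urllib.request

# Hypothetical endpoint -- replace with the URL of your own deployment.
ENDPOINT = "http://localhost:8080/v1/models/model:predict"


def build_payload(prompt: str) -> bytes:
    """Encode a prompt as the JSON body the model server expects.

    The {"prompt": ...} schema is an assumption; check your Truss's
    input spec for the actual field names.
    """
    return json.dumps({"prompt": prompt}).encode("utf-8")


def ask(prompt: str) -> str:
    """POST a prompt to the deployed model and return its completion."""
    req = urllib.request.Request(
        ENDPOINT,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # "completion" is likewise an assumed response key.
        return json.load(resp)["completion"]
```

Swapping `ENDPOINT` for your deployment's address is all the wiring a quick demo needs; anything fancier (auth headers, streaming) would follow the same request shape.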
We'd love to hear any feedback you have!
Viacheslav_Varenia t1_jd91mou wrote
My first impressions.
ChatLLaMA gives accurate answers based on data from after 2021, and that's its advantage over ChatGPT.
If you ask a general clarifying follow-up question, ChatLLaMA loses the context and gives an irrelevant answer.