
Nezarah t1_jdz1zqc wrote

For specifically personal use and research, and not commercial? LLaMA is a good place to start, and/or Alpaca 7B. They're small scale (the 7B models can run locally on most consumer hardware) and can be fine-tuned with LoRA, which trains only a small set of adapter weights instead of the whole model. The context window is decent too: 2048 tokens.
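
For a feel of what LoRA fine-tuning looks like, here's a minimal sketch using the Hugging Face `peft` library. The checkpoint name and hyperparameters are illustrative assumptions, not a recommended recipe:

```python
# Minimal LoRA setup sketch with Hugging Face peft.
# "huggyllama/llama-7b" is an assumed checkpoint id for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "huggyllama/llama-7b"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora_config = LoraConfig(
    r=8,                                  # low-rank dimension of the adapters
    lora_alpha=16,                        # scaling factor applied to adapter output
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the tiny adapter weights are trainable
```

From here you'd train `model` with a normal `transformers` training loop; the base weights stay frozen, which is why this fits on modest hardware.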

Outputs can be comparable to GPT-3, and they can be further improved with pre-context prompting (i.e., in-context learning: prepending relevant background to each query so the model conditions on it).
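
Pre-context prompting is just string assembly, no extra libraries needed. A tiny sketch (the context and question are made up for illustration):

```python
# Prepend background text so the model conditions on it before answering.
context = (
    "You are a research assistant. Background: LLaMA is a family of "
    "open foundation models; the 7B variant runs on consumer GPUs."
)
question = "What hardware do I need to run the 7B model locally?"

prompt = f"{context}\n\nQuestion: {question}\nAnswer:"
# Pass `prompt` (not the bare question) to the model's generate() call.
```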

You can add branching/chaining functionality through the LangChain library.
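
As a rough sketch of what branching looks like with LangChain's classic 0.0.x API: run one chain to classify the question, then route to a different follow-up chain based on its output. The prompts and the local model id below are assumptions for illustration:

```python
# Branching sketch: classify first, then pick a chain based on the result.
from langchain.llms import HuggingFacePipeline
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# Assumed local checkpoint, loaded as a text-generation pipeline.
llm = HuggingFacePipeline.from_model_id(
    model_id="huggyllama/llama-7b",
    task="text-generation",
)

classify = LLMChain(
    llm=llm,
    prompt=PromptTemplate(
        input_variables=["question"],
        template="Is this question about code or prose? Answer one word.\n{question}",
    ),
)
code_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate(
        input_variables=["question"],
        template="Answer as a programmer:\n{question}",
    ),
)
prose_chain = LLMChain(
    llm=llm,
    prompt=PromptTemplate(
        input_variables=["question"],
        template="Answer in plain English:\n{question}",
    ),
)

question = "How do I fine-tune a 7B model?"
label = classify.run(question)  # first chain's output decides the branch
answer = (code_chain if "code" in label.lower() else prose_chain).run(question)
print(answer)
```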
