pitrucha t1_j9cdugi wrote
Reply to comment by nashcaps2724 in Fine tuning a GPT for text generation by nashcaps2724
More like: given item x from Corpus B, predict item y from Corpus A.
pitrucha t1_j9ccuun wrote
Reply to Fine tuning a GPT for text generation by nashcaps2724
Proceed as you would with summarization. You can fine-tune the GPT family via the OpenAI API, or get yourself T5 and train it. Training from scratch would be a waste of resources.
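Either way, the data preparation looks the same as for summarization: each training example is an (input, target) pair. A minimal sketch, assuming hypothetical placeholder corpora, of formatting Corpus B → Corpus A pairs as prompt/completion JSONL records (the layout used by OpenAI's legacy fine-tuning endpoint; a T5 seq2seq setup would use equivalent input/target fields):

```python
import json

# Hypothetical paired data: item x from Corpus B maps to item y from Corpus A.
pairs = [
    ("Corpus B item 1", "Corpus A item 1"),
    ("Corpus B item 2", "Corpus A item 2"),
]

def to_finetune_records(pairs, sep="\n\n###\n\n", stop="\nEND"):
    """Turn (x, y) pairs into prompt/completion dicts.

    The separator marks where the prompt ends and the stop sequence
    marks where generation should halt, mirroring the summarization
    fine-tuning recipe."""
    return [
        {"prompt": x + sep, "completion": " " + y + stop}
        for x, y in pairs
    ]

records = to_finetune_records(pairs)
# One JSON object per line = JSONL, ready to upload or feed a trainer.
jsonl = "\n".join(json.dumps(r) for r in records)
print(jsonl)
```

The separator and stop strings here are arbitrary; what matters is that they are used consistently at training and inference time.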
pitrucha t1_jckiv1q wrote
Reply to [D] An Instruct Version Of GPT-J Using Stanford Alpaca's Dataset by juliensalinas
Any plans to quantize it? I saw that someone managed to do so with the 65B LLaMA and push it from 120 GB down to 30 GB.