Submitted by vom2r750 t3_yqqyo7 in singularity
visarga t1_ivt8r4i wrote
Reply to comment by [deleted] in They Put GPT-3 Into That Robot With Creepily Realistic Facial Expressions and Yikes by vom2r750
More recently, GPT-3 can take around 4,000 tokens of context. If you have a dataset of texts, you can build a search engine that puts the top results into that context; GPT-3 can then reference them and answer as if it were up to date.
Using this trick, a model 25x smaller (in parameters) got results similar to a big model; they had about 1 trillion tokens of text in the retrieval database.
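A rough sketch of the idea, assuming a small local corpus, a TF-IDF retriever from scikit-learn, and the pre-1.0 `openai` Python client (the corpus, the model name, and the helper `answer_with_retrieval` are made up for illustration, not from the comment):

```python
# Minimal retrieval-augmented prompting sketch (illustrative only).
# Assumes: `pip install scikit-learn openai` and the pre-1.0 openai client API.
import openai
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy "dataset of texts" standing in for a real document store.
documents = [
    "Ameca is a humanoid robot built by Engineered Arts.",
    "GPT-3's newer models accept roughly 4,000 tokens of context.",
    "Retrieval-augmented models look up reference text at inference time.",
]

# Index the corpus once with TF-IDF (a real system would use dense embeddings).
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def answer_with_retrieval(question: str, top_k: int = 2) -> str:
    """Retrieve the most relevant documents and stuff them into the prompt."""
    query_vector = vectorizer.transform([question])
    scores = cosine_similarity(query_vector, doc_vectors)[0]
    top_docs = [documents[i] for i in scores.argsort()[::-1][:top_k]]

    prompt = (
        "Answer the question using the reference passages below.\n\n"
        + "\n".join(f"- {d}" for d in top_docs)
        + f"\n\nQuestion: {question}\nAnswer:"
    )
    # Old-style Completion endpoint; the model name is just an example.
    response = openai.Completion.create(
        model="text-davinci-003", prompt=prompt, max_tokens=150
    )
    return response["choices"][0]["text"].strip()

print(answer_with_retrieval("How much context can GPT-3 take?"))
```

The retrieval-trained models mentioned above do something similar at much larger scale, with the lookup built into the model itself rather than bolted on through the prompt.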
Akimbo333 t1_ivxeakg wrote
Wow!