rhofour
rhofour t1_j14nwp0 wrote
Reply to [D] What GPT-esque model/platform returns peer-reviewed sources with outputs? by EntireInflation8663
https://www.perplexity.ai/ is trying to do this by combining GPT-3 and Bing search results. Asking a question like "when did Abraham Lincoln land on the moon?" reveals there's a lot of work to be done here.
As far as I know, no one has a language model that can reliably attribute its responses to sources yet.
Edit: I reread what you wrote and realized you're looking specifically for peer-reviewed sources. I'm not sure such a system exists yet. Galactica by Meta might be the closest, but it was trained only on a collection of academic sources with appropriate licensing terms.
rhofour t1_j153vzt wrote
Reply to [D] Build a home PC to Run Large GPT Models or use AWS by [deleted]
GPT is not a public model, so you can't train or run it yourself.
I just checked, and OpenAI does have a fine-tuning API, so you can fine-tune the model and use it through their service, but in that case your own hardware doesn't matter.
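For what that looks like in practice: at the time, OpenAI's fine-tuning endpoint expected training data as a JSONL file of prompt/completion pairs. A minimal sketch of preparing such a file (the helper name and example pairs are my own; the actual upload and training run happen through OpenAI's API or CLI, not locally):

```python
import json

def make_finetune_jsonl(pairs, path):
    """Write (prompt, completion) pairs as JSONL, one JSON object per
    line, which is the format OpenAI's fine-tuning endpoint expects."""
    with open(path, "w") as f:
        for prompt, completion in pairs:
            f.write(json.dumps({"prompt": prompt, "completion": completion}) + "\n")

# Hypothetical toy examples; real fine-tuning data would be much larger.
pairs = [
    ("Q: What is 2+2?\n\n", " 4"),
    ("Q: What is the capital of France?\n\n", " Paris"),
]
make_finetune_jsonl(pairs, "data.jsonl")
```

From there you'd upload the file and kick off the job through OpenAI's tooling; your local machine only ever prepares the data.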
You can look at open-source reproductions of GPT like OPT, but the hardware needed just to run a model that size, let alone train it, is very expensive. If you really want to use one of these huge models yourself (and not through an API), I'd advise starting with AWS before you consider buying any hardware.
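To put rough numbers on "very expensive": just holding a model's weights in GPU memory at fp16 takes about 2 bytes per parameter, before counting activations or any optimizer state for training. A back-of-envelope sketch (parameter counts are the published OPT sizes; the helper function is mine):

```python
def weight_memory_gib(n_params, bytes_per_param=2):
    """Memory needed just to hold the weights (fp16 by default).
    Inference activations add more; training with Adam needs several
    times this for gradients and optimizer state."""
    return n_params * bytes_per_param / 2**30

# Published OPT model sizes
for name, n in [("OPT-6.7B", 6.7e9), ("OPT-30B", 30e9), ("OPT-175B", 175e9)]:
    print(f"{name}: ~{weight_memory_gib(n):.0f} GiB of fp16 weights")
```

The largest OPT checkpoint needs on the order of 326 GiB for the weights alone, i.e. multiple 80 GB A100s just for inference, which is exactly why renting cloud GPUs first makes sense.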