Submitted by EntireInflation8663 t3_zr8zfc in MachineLearning
The main issue I have with GPT-3 is that its output can be compelling yet factually incorrect. I remember coming across a platform that generates answers alongside sources, but I can't recall the name.
WaterAirFireEarth t1_j12dahs wrote
Citing sources doesn't equal being factually correct. Many humans who cite articles don't read the full article, misrepresent the work, or are biased toward (e.g.) famous or familiar authors.