[D] GPT-4 might be able to tell you if it hallucinated
Submitted by Cool_Abbreviations_9 (t3_123b66w) on March 27, 2023 at 4:21 AM in MachineLearning (100 comments, 634 points)
Chabamaster (t1_jdxaqdd) wrote on March 27, 2023 at 9:33 PM: The fact that people now call wrong answers "hallucinations" seems very strange to me; it sounds like a marketing term meant to make the model seem smarter or conscious. (6 points)