Submitted by LevelWriting t3_zk0qek in singularity
fingin t1_j031gnr wrote
Reply to comment by Relative_Rich8699 in Character ai is blowing my mind by LevelWriting
Even GPT-4 will make silly mistakes. That's what happens when a model is trained to predict probable word sequences rather than having genuine knowledge of language the way people do.
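To illustrate what "probable word sequences" means, here's a toy bigram sketch (nothing like GPT-4's actual architecture, just the basic idea of predicting the next word from frequency statistics). The corpus and function name are made up for the example:

```python
from collections import Counter, defaultdict

# Toy corpus, purely illustrative.
corpus = "the duck swims the duck quacks the cat sleeps".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_probable_next(word):
    # Pick the most frequent follower -- a "probable sequence",
    # with no understanding of what the words mean.
    return follows[word].most_common(1)[0][0]

print(most_probable_next("the"))  # "duck" follows "the" twice, "cat" once
```

The model happily outputs whatever is statistically likely, which is exactly why it can produce fluent-sounding but silly answers.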
Relative_Rich8699 t1_j033bjo wrote
Yes. But I was speaking to "the company's" bot on purpose, and I would only say it should be trained on company data for those company-specific questions. When I inquire about ducks, it can draw on the world's written word.