
Marchello_E t1_j738ab0 wrote

ChatGPT is already able to convince people that it has the right information. It is trained to find statistical correlations in language, not truths. For now it gets its information from sources written by humans, but who actually knows what those sources are. The more often a piece of information gets repeated, the more likely it is to end up in the training data. When more and more articles in the near future get written by an AI (not necessarily the same one), the validity of the constructed narrative will start to spiral downward at an alarming rate, as the "source" simply gets reinforced by its own wackiness.
