
IronJackk t1_ixa906n wrote

There was an experiment a researcher ran years ago where he offered $500 to any participant who could win his game. The researcher played the role of a sentient AI trapped in a computer, and the participant played the role of a scientist chatting with the AI over text. If the AI could convince the scientist to let it escape, using only conversation and no bribes or threats, the AI won. If the participant still refused to let the AI escape after 2 hours, the participant won the $500. The AI almost never lost.

I am butchering the details but that is the gist.
