
Captain_Clark t1_j99bx6b wrote

This is nothing new. ELIZA had a similar effect upon users decades ago, despite its far cruder capabilities at language construction.

>> Shortly after Joseph Weizenbaum arrived at MIT in the 1960s, he started to pursue a workaround to this natural language problem. He realized he could create a chatbot that didn’t really need to know anything about the world. It wouldn’t spit out facts. It would reflect back at the user, like a mirror.

>> Weizenbaum had long been interested in psychology and recognized that the speech patterns of a therapist might be easy to automate. The results, however, unsettled him. People seemed to have meaningful conversations with something he had never intended to be an actual therapeutic tool. To others, though, this seemed to open a whole world of possibilities.

>> Weizenbaum would eventually write of ELIZA, “What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”

ChatGPT is light-years beyond ELIZA’s capabilities. But Weizenbaum’s concerns remain, and they’re how we got here: to a point where you are entranced in exactly the same way ELIZA’s users were.
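For context, ELIZA’s “mirror” was little more than keyword matching plus pronoun swapping. Here’s a minimal Python sketch of that kind of trick (the rules below are invented for illustration, not Weizenbaum’s actual DOCTOR script):

```python
import re
import random

# Swap first- and second-person words so the reply mirrors the user.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

# Hypothetical keyword rules in the spirit of ELIZA; each pattern
# captures a fragment of the user's input to echo back as a question.
RULES = [
    (re.compile(r"i feel (.*)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["Why do you say you are {0}?"]),
    (re.compile(r"my (.*)", re.I),
     ["Tell me more about your {0}."]),
]

def reflect(fragment: str) -> str:
    """Flip pronouns word by word ("me" -> "you", etc.)."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    """Return a content-free, mirrored reply to the user's input."""
    for pattern, templates in RULES:
        match = pattern.search(text)
        if match:
            return random.choice(templates).format(reflect(match.group(1)))
    # No keyword matched: fall back to a generic prompt to keep talking.
    return "Please go on."

print(respond("I feel nobody listens to me"))
# e.g. "Why do you feel nobody listens to you?"
```

That’s roughly the whole trick, which is what makes the “powerful delusional thinking” Weizenbaum described so striking.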

32

Master00J t1_j99ycse wrote

I think this tells us a little about the nature of therapy, really. I see therapy not as a conversation, but as a tool for YOU to organise your OWN thoughts. Therapy capitalises on the animalistic human instinct for communion and camaraderie in order to allow us to ‘open up.’ Half the job of a therapist is simply being present. I imagine that if we had a 100% realistic wax imitation of a human and simply told the patient it was a very, very quiet therapist, and compared that with telling the patient to speak into a microphone in a room alone, we would see far greater results in the former.

11

Captain_Clark t1_j9ay4xy wrote

What you’re describing is also what supporters of the idea that an “electronic therapist” might benefit a suffering person have suggested.

There are indeed possibilities here, though I’d say there seem to be as many pitfalls.

You are correct in saying that a cognitive therapist is a listener. But they’re a trained, professional listener, one attuned to the nuances of sentience. A cognitive therapist will listen so well that they’ll be able to point out things you’ve repeated and associations you’ve made, and indicate these to you.

E.g., “You’ve mentioned your mother every time you’ve described the difficulties in your relationships,” or “You’ve mentioned your uncle three times and began fidgeting with your clothing. What can you tell me about him?”

So yes, it’s a job of listening. But it’s listening very attentively, and also watching a patient as they become tense or struggle for words. It’s observing. The reason the therapist is a highly trained observer is that we don’t observe ourselves and don’t recognize our own problematic patterns. Perhaps that uncle molested the patient, and the patient is repressing the memories while still suffering from them.

A chatbot may be a good venue for us to vent our feelings, and maybe to recognize some of our patterns, though I suspect we’d not do that very well, because we’re basically talking to ourselves while a bot which can’t see us and has no sentience responds to our prompts. We already can’t see our patterns. Nor will ChatGPT, which does not retain previous chats: one could write the same irrational obsession to ChatGPT every day, and ChatGPT would never recognize that an obsession exists.

It’s writing therapy, I suppose. But does it provide guidance? And can it separate our good ideas from our harmful ones? I’m doubtful about that, and if it could be trained to, such a tool could actually be employed as a brainwashing machine. I don’t consider that hyperbole: imagine the Chinese government mandating that its citizens speak with a government chatbot. They already have “re-education” camps and “behavioral ranking” systems.

I’m reminded of this scene.

3