Submitted by Draconic_Flame t3_11rfyk2 in Futurology
Captain_Quidnunc t1_jc915vu wrote
I think you are horribly mistaken and "Therapist" will be in one of the first major rounds of job eliminations. Every study conducted on the topic shows people are more comfortable speaking to computer therapists than human therapists. And have improved outcomes as a result.
There's no feeling of being judged by a computer to overcome before progress can be made.
Many people have in fact been using AI as their therapist for a while now. And I'm certain the number will only exponentially increase as soon as famous people start recommending it as a low cost alternative to traditional talk therapy.
Plus...it's free, available on demand 24/7 and can be accessed from home.
I'm not sure how you think traditional therapists will be able to compete with that.
You will not be able to compete with that in a free market.
There simply isn't a better value proposition than free, whenever you want and from your couch. Unless you are planning on doing free, 24/7, on demand house calls.
So there are some jobs AI will struggle to replace. Like on site construction and maintenance work.
But "Therapist" isn't one of them.
It will likely be one of the first jobs on the AI chopping block.
MamaMiaPizzaFina t1_jc9plxc wrote
you should see my chatgpt chat history.
However, every second message it says to find a real therapist. So unless we are dealing with another AI that is trained to pretend to be a therapist and not suggest finding one, therapists might not be in as much danger.
However, if there is a chat AI that is trained to pretend to be a therapist and will not suggest contacting a real one, imagine the lawsuit and bad press as soon as one of its users (probably a few, so maybe a class action lawsuit) commits suicide. Imagine the parents and families scrutinizing every chat log and blaming it for what happened.
[deleted] t1_jcaa7fr wrote
[removed]
Captain_Quidnunc t1_jcakff9 wrote
K.
You are listing a bunch of things that are completely irrelevant.
Nobody cares if AI gives them warning messages. And AI only gives you warning messages while the people who programmed it are worried about getting sued.
And it's not legally possible to sue an internet company due to Section 230 of the Communications Decency Act. So if consumers don't like them and they decrease profits, they will disappear.
Irrelevant.
Nobody thinks "real therapists" are effective to begin with. So they won't really expect AI therapists to be much if any better. So the bar for acceptance is remarkably low. And it's impossible to sue a "real therapist" if someone commits suicide while under their care.
So again, irrelevant.
If everyone who needed a therapist tried to get care from "real therapists" there would be a shortage of "real therapists" on the order of 30,000 providers at a minimum. Average wait times are now approximately 4-6 months just to get an appointment. 70% of therapists in most areas refuse to accept new clients. And most insurance makes it nearly impossible to get reimbursed.
So to the average person, seeing a "real therapist" isn't even an option.
And last and most important, healthcare in this country is a for-profit industry. The largest expense to any corporation is salary paid to skilled workers. And the more skilled workers they can eliminate from payroll, the more investors make.
So just like all other white collar work, the millisecond a company can fire every single skilled worker and replace their work with a free computer program they will. Because by doing so, the board gets a raise.
And they are well aware that during the Bush administration we changed corporate law to make it impossible for individuals to sue companies for anything. And since then the courts have upheld this.
So there aren't enough "real therapists" to meet demand in the first place.
Nobody cares about the warnings other than the annoyance and they won't last long.
Businesses profit from AI therapists and lose money creating or hiring more "real therapists".
And no company must, or does, fear getting sued because it's not possible to sue them.
Therefore the career "real therapist" will not survive the first round of mass layoffs any more than "real radiologist" or "real computer programmer".
It's a dead career. With a shelf life of approximately 3-5 years.
algoborn t1_jclmgkt wrote
Got any of these studies?
Captain_Quidnunc t1_jclzk6a wrote
"Got any of these studies?" Please.
Got any studies that say otherwise?
If you do...I'm sure the psychological community would love any data refuting that humans are more comfortable talking to computers than to them. Granted, the Google search history, let alone the AI chat logs, of any living human would immediately falsify that data and render it moot. But I'm sure they would love to hear about it.
If you are going to search for this sort of data I would actually suggest Consensus. It's better for finding peer reviewed data than Google.
And make sure to differentiate between studies gauging stated public preference and studies of observed behavior. Because that's the other side of this coin. We lie to our doctors. We tell the truth to computers.
[deleted] t1_jdf7xa8 wrote
[removed]
NanditoPapa t1_jc943ro wrote
I would love an AI therapist. But, that said, I think the majority are more comfortable sharing emotional states with other humans. Yes, telling deep dark secrets might be easier with something you're certain won't judge you, but most of the time therapy is about mundane yet relatable issues people are trying to connect with and process. Communicating genuine empathy or sympathy isn't likely to happen soon because it will take time for acculturation; people will have to grow up being told how to interact emotionally with an AI therapist.