Care_Best t1_j17laxk wrote

With the number of people in this world who suffer from depression and loneliness, it's not ridiculous to assume that a substantial fraction of the population will reject reality for a virtual one. Imagine when virtual reality gets to Matrix level through some future iteration of Neuralink (BCI), where you can taste the virtual food and feel the virtual water on your skin. Couple that with an AI that you fell in love with that looks like whatever celebrity crush you have, and it's an alluring proposition to just reject reality altogether and escape to paradise.

23

superhyooman t1_j17lf9f wrote

Hell, you don’t gotta be depressed for that to sound desirable!

23

angus_supreme t1_j18sby1 wrote

No but something tells me we'll be there before we make drugs that work lol

5

Sieventer t1_j193eln wrote

So many people are like that. Virtual reality would open doors to incredible experiences.

2

Pawneewafflesarelife t1_j17qazj wrote

This has the potential to really help with therapy and the mental health crisis, if good systems are developed. I would definitely use an AI therapist if they worked - a big issue with therapy is finding a therapist you feel comfortable talking to, whose method clicks with you. Imagine if you could just try out a new digital personality or therapy style with a button click.

5

matt_flux t1_j180764 wrote

How will it help with therapy? A healthy mind needs challenges, not constant dopamine. This will lead to addiction and dependence.

1

Pawneewafflesarelife t1_j1bwrll wrote

Therapists are overloaded IRL and many people can't afford them. An AI which can serve that role would help a lot of people work through issues. Don't understand the addiction and dopamine comment - we're literally talking about therapy with an AI instead of a person.

1

FapSimulator2016 t1_j183k1a wrote

I'm not sure how helpful that would be. There's something about being understood by an AI rather than a person that makes therapy feel redundant, but maybe if the difference at that point is non-existent then it won't really matter…

1

Pawneewafflesarelife t1_j1bwjft wrote

For me, I wouldn't mind knowing they are AI. I have a lot of past trauma that I just don't really think about which I need to work through, but I'm pretty good at analysing events and patterns in my life once I sit and think about them. So an AI therapist would be kinda like guided journaling, with different calls to action based on therapy style.

The lack of a human element might make people more honest and earnest about treatment, too. When I was younger, I wasn't really honest in therapy - I didn't want the therapist to think badly of me, and I was afraid I'd be locked up if I talked about dark stuff. OTOH, AI therapy might increase paranoid avoidance of therapy, since transcription of the session would be instant. There would have to be huge protocols for privacy.

2