Submitted by Ortus12 t3_y013ju in singularity
sideways t1_irpfahg wrote
It has already started with apps like Replika.
At the moment, the human tendency to anthropomorphize is meeting language models halfway - but it won't be long until we're in Her territory. I'd expect many people to have a language model as their primary means of emotional support by 2030.
People are (correctly) alarmed by superhuman intelligence but I'm just as worried by superhuman charm, kindness, empathy and persuasiveness.
Reddituser45005 t1_irporok wrote
In the 70s or 80s they had a fad where people kept pet rocks. The bar is surprisingly low.
r0cket-b0i t1_irq6emb wrote
I have a pet rock...
myusernameblabla t1_irqcz5x wrote
Do you, like, err, you know … pet her?
r0cket-b0i t1_irqd81l wrote
Yes, mine is a He and it has an opening that looks like a mouth of a pacman with amethyst crystals in there, so I pet it for sucking the bad energy :)
RunItAndSee2021 t1_irramlr wrote
rock extra terrestrial boi
neo101b t1_irriq3d wrote
Then in the 90s we had digital pets.
Entire-Watch-5675 t1_irsdpse wrote
I love my pet deck.
freeman_joe t1_irr4nxg wrote
😂
Flare_Starchild t1_irqi91j wrote
What would you be concerned about superhuman kindness for?
sideways t1_irqjy9s wrote
Good point. Of course, ultimately, superhuman kindness is exactly what we want in an AGI. However, I think the *appearance* of superhuman kindness in "companion" language models would just be another kind of superstimulus that a normal human couldn't compete with.
If you spend a significant amount of time interacting with an entity that never gets angry or irritated, dealing with regular humans could be something you would come to avoid.
overlordpotatoe t1_irqlt39 wrote
Alternatively, they could make us better people by modelling behaviours like good conflict resolution skills and mindfulness.
sideways t1_irqlvlx wrote
You're absolutely right. I certainly hope it works out that way.
[deleted] t1_irpljhe wrote
[deleted]
KillHunter777 t1_irpm99g wrote
Why though? I personally see nothing wrong with it. If an AI can provide better emotional support than anyone else, then why not?
[deleted] t1_irpn7bw wrote
[deleted]
wordyplayer t1_irq2wuk wrote
what if your spouse is incapable?
Unumbium t1_irq31h8 wrote
Does it have to be only one or the other...?
imlaggingsobad t1_irpxpnl wrote
an AI could show more humanity than a human. Think about that for a second. Maybe these AIs become so intelligent and so enlightened that they make us look like barbarians by comparison. The most compassionate 'soul' on this earth could be an AI.
[deleted] t1_irpzdww wrote
[deleted]
wordyplayer t1_irq30ob wrote
you're the first person to mention "abandon". It was not part of the conversation until now. You can get emotional support from X, without abandoning Y.
sideways t1_irpmivv wrote
I'd expect many people to have both. What I'm concerned about is how, eventually, human companionship might just not be very compelling compared to a good language model.
An "AI" partner has no needs of its own. It can be as endlessly loving or supportive or kinky or whatever as you need it to be. Once they can also give legitimately good advice I can imagine a lot of people finding real human relationships to be not much more than a pain in the ass. Human relationships are hard!
IndependenceRound453 t1_irpo0uk wrote
I'm glad this concerns you. This should concern people.
And you're right, human relationships are hard. But that's the beauty of them: you have to work hard for them (not so hard it's toxic, of course).
Another thing that makes relationships wonderful is that it's about your partner as much as you. If my former relationships had been only about me and not my partners, that would've been unbelievably boring and unbearable.
Idk, that's just my two cents. Like I said, I hope we don't reach a world where people choose to have an AI partner over a human one if the former is an option, but only time will tell.
ThoughtSafe9928 t1_irpqphg wrote
You take human error over an AI able to analyze literally everything and formulate the perfect solution/response?
IndependenceRound453 t1_irpstj9 wrote
Emotional support isn't mathematics, so you're not gonna get a 100% perfect solution. And I wouldn't even mind consulting it, I just wouldn't ignore my partner in favor of an app.
Successful_Border321 t1_irqz70r wrote
The ‘majority of humans’ will never prefer artificial relationships. You dopes saying language models are equivalent to a living breathing partner are so pathetic it boggles the mind. Get out of your mom’s basement and go meet some people in real life. JFC.
everslain t1_irr6qi7 wrote
Gee why would people rather chat with a nice AI than humans like this
Successful_Border321 t1_irrqikp wrote
Truth is hard to read, I get it. And I believe, and am totally fine with, a large percentage of the male population being lost to sex robot girlfriends. But there are few to zero people on earth who have shared an intimate relationship with a living, breathing human who would trade that for a computer that can mimic human interaction.