
sideways t1_irpfahg wrote

It has already started with apps like Replika.

At the moment, the human tendency to anthropomorphize is meeting language models halfway - but it won't be long until we're in *Her* territory. I'd expect many people to have a language model as their primary means of emotional support by 2030.

People are (correctly) alarmed by superhuman intelligence but I'm just as worried by superhuman charm, kindness, empathy and persuasiveness.

78

Reddituser45005 t1_irporok wrote

In the 70s or 80s they had a fad where people kept pet rocks. The bar is surprisingly low.

45

r0cket-b0i t1_irq6emb wrote

I have a pet rock...

17

myusernameblabla t1_irqcz5x wrote

Do you, like, err, you know … pet her?

13

r0cket-b0i t1_irqd81l wrote

Yes, mine is a He, and it has an opening that looks like the mouth of a Pac-Man with amethyst crystals inside, so I pet it to suck out the bad energy :)

20

neo101b t1_irriq3d wrote

Then in the 90s we had digital pets.

3

Flare_Starchild t1_irqi91j wrote

Why would you be concerned about superhuman kindness?

7

sideways t1_irqjy9s wrote

Good point. Of course, ultimately, superhuman kindness is exactly what we want in an AGI. However, I think the *appearance* of superhuman kindness in "companion" language models would just be another kind of superstimulus that a normal human couldn't compete with.

If you spend a significant amount of time interacting with an entity that never gets angry or irritated, dealing with regular humans could be something you would come to avoid.

21

overlordpotatoe t1_irqlt39 wrote

Alternatively, they could make us better people by modelling behaviours like good conflict resolution skills and mindfulness.

10

sideways t1_irqlvlx wrote

You're absolutely right. I certainly hope it works out that way.

5

[deleted] t1_irpljhe wrote

[deleted]

−2

KillHunter777 t1_irpm99g wrote

Why though? I personally see nothing wrong with it. If an AI can provide better emotional support than anyone else, then why not?

9

imlaggingsobad t1_irpxpnl wrote

An AI could show more humanity than a human. Think about that for a second. Maybe these AIs become so intelligent and so enlightened that they make us look like barbarians by comparison. The most compassionate 'soul' on this earth could be an AI.

8

[deleted] t1_irpzdww wrote

[deleted]

−1

wordyplayer t1_irq30ob wrote

You're the first person to mention "abandon"; it wasn't part of the conversation until now. You can get emotional support from X without abandoning Y.

4

sideways t1_irpmivv wrote

I'd expect many people to have both. What I'm concerned about is how, eventually, human companionship might just not be very compelling compared to a good language model.

An "AI" partner has no needs of its own. It can be as endlessly loving or supportive or kinky or whatever as you need it to be. Once they can also give legitimately good advice I can imagine a lot of people finding real human relationships to be not much more than a pain in the ass. Human relationships are hard!

6

IndependenceRound453 t1_irpo0uk wrote

I'm glad this concerns you. This should concern people.

And you're right, human relationships are hard. But that's the beauty of them: you have to work hard for them (not so hard that it's toxic, of course).

Another thing that makes relationships wonderful is that they're about your partner as much as about you. If my former relationships had been only about me and not my partners, they would've been unbelievably boring and unbearable.

Idk, that's just my two cents. Like I said, I hope we don't reach a world where people choose to have an AI partner over a human one if the former is an option, but only time will tell.

1

ThoughtSafe9928 t1_irpqphg wrote

You'd take human error over an AI that can analyze literally everything and formulate the perfect solution/response?

4

IndependenceRound453 t1_irpstj9 wrote

Emotional support isn't mathematics, so you're not gonna get a 100% perfect solution. And I wouldn't even mind consulting it; I just wouldn't ignore my partner in favor of an app.

1

Successful_Border321 t1_irqz70r wrote

The ‘majority of humans’ will never prefer artificial relationships. You dopes saying language models are equivalent to a living breathing partner are so pathetic it boggles the mind. Get out of your mom’s basement and go meet some people in real life. JFC.

−9

everslain t1_irr6qi7 wrote

Gee, why would people rather chat with a nice AI than with humans like this?

6

Successful_Border321 t1_irrqikp wrote

Truth is hard to read, I get it. And I believe, and am totally fine with, a large percentage of the male population being lost to sex-robot girlfriends. But there are few to zero people on earth who have shared an intimate relationship with a living, breathing human and would trade it for a computer that can mimic human interaction.

−2