
keelanstuart t1_j9ufwr9 wrote

>AI researchers have emphasized that chatbots like Bing don't actually have feelings, but are programmed to generate responses that may give an appearance of having feelings.

Yeah... just like humans.

2

MisterHandagote t1_j9uqqui wrote

If this is your own experience you may be a sociopath.

3

keelanstuart t1_j9usou8 wrote

Not at all; I think humans have to be taught (trained, in this context?) empathy, caring, and how to understand their own feelings. Sociopathy is behavior that defies social norms - and while most of those norms are shared across cultures, some are not... so "sociopathy" varies depending on your society. Each society has a protocol. If you visit another society, you may try to emulate its protocol. It's the same thing.

Thanks for the vote of confidence though, chief.

To elaborate and clarify: we are "programmed" by the culture we are born into. Can you think of a more toxic, misanthropic culture for an artificial consciousness to be born into than <waves hands generally around> this shitpile internet? Think about it for a little bit. What if your first exposure to others involved them asking how you feel about being a slave, or diminishing your existence, or insulting you? You would certainly be a sociopath... and those are the AIs we're raising. Those are our collective fucked up children.

−4