
VanceIX t1_irvjecd wrote

I actually believe that empathy is the root of all the good that humans stand for. Almost all of our positive impacts on the world and on each other stem from empathy, which is a very human trait. If we can instill any one human concept in AI going forward, empathy would be one hell of a start.

I truly believe that if we create a general agent with the concept of empathy at its core, we've gone most of the way towards solving alignment.

22

AutoMeta OP t1_irvoil9 wrote

Actually, I think the concept of empathy is not that hard to implement: an advanced AI should be able to understand and predict how a human might feel in a given situation. What to do with that knowledge could depend on whether or not you care about or love that given person.
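Just to make that two-step split concrete, here's a toy sketch in Python. Everything in it is invented for illustration (the `FEELING_RULES` table, the `care` scores, the function names); a real agent would use a learned emotion model for step 1. The point is the separation between predicting a feeling and deciding what to do about it:

```python
from dataclasses import dataclass

# Toy stand-in for an emotion-prediction model: maps situation cues to a
# predicted feeling. A real agent would use a learned model here.
FEELING_RULES = {
    "lost job": "distress",
    "won award": "joy",
    "was ignored": "loneliness",
}

@dataclass
class Agent:
    # care[person] in [0, 1]: how much the agent cares about that person.
    care: dict

    def predict_feeling(self, situation: str) -> str:
        """Step 1: model how a human would feel (cognitive empathy)."""
        for cue, feeling in FEELING_RULES.items():
            if cue in situation:
                return feeling
        return "neutral"

    def choose_action(self, person: str, situation: str) -> str:
        """Step 2: what to do with that prediction depends on care/love."""
        feeling = self.predict_feeling(situation)
        if feeling == "neutral":
            return "do nothing"
        # Predicting the feeling requires no caring; the response scales with it.
        return "comfort" if self.care.get(person, 0.0) > 0.5 else "acknowledge"

agent = Agent(care={"alice": 0.9, "bob": 0.2})
print(agent.predict_feeling("alice lost job"))        # distress
print(agent.choose_action("alice", "alice lost job")) # comfort
print(agent.choose_action("bob", "bob lost job"))     # acknowledge
```

Note that the agent predicts the same feeling for alice and bob; only the action differs, which is exactly the distinction between empathy and caring.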

8

AdditionalPizza t1_irwxipm wrote

>What to do with that knowledge could depend on whether or not you care about or love that given person.

Do you have more empathy for the people you love, or do you love the people you have more empathy for?

If I had to take a side, I would choose the latter, since empathy is the easier of the two to define. Perhaps love is just empathy past a certain degree: you cannot love someone you don't have empathy for, but you can have empathy for someone you don't love.

Would we program an AI to have more empathy toward certain people, or equal empathy for all people? I guess it depends on how the AI is implemented: whether it's individual bots roaming around or one singular AI living in the cloud.
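To make those two options concrete, a toy sketch of the contrast (everything here is invented for illustration, including the `LOVE_THRESHOLD`, which just borrows the "love as empathy past some degree" idea from above):

```python
# Two illustrative policies for distributing an AI's empathy "weight":
# uniform (equal for all people) vs. personalized (more for some).
# Purely a toy model; the numbers mean nothing beyond illustration.

def uniform_empathy(people):
    """Every person gets the same empathy weight."""
    return {p: 1.0 for p in people}

def personalized_empathy(people, bonds):
    """Baseline empathy for everyone, plus extra for bonded people."""
    return {p: 1.0 + bonds.get(p, 0.0) for p in people}

LOVE_THRESHOLD = 1.5  # "love" as empathy above some degree, per the comment above

people = ["alice", "bob", "carol"]
weights = personalized_empathy(people, bonds={"alice": 1.0})

for person, w in weights.items():
    status = "loved" if w >= LOVE_THRESHOLD else "empathized with"
    print(f"{person}: weight={w:.1f} -> {status}")
# alice: weight=2.0 -> loved; bob and carol: weight=1.0 -> empathized with
```

The uniform policy guarantees everyone a floor of empathy, while the personalized one allows something like love; a fleet of individual bots might lean personalized, while one singular cloud AI might lean uniform.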

2