
AutoMeta OP t1_irwpxr3 wrote

Wow! Thanks for the great answer. I loved the "subversion or confirmation of expectation". I do think computers can be emotional, but by opposing a more emotional program externally (from the root) to a more rational one, they should arrive at different conclusions and be required to reach consensus. So Love, being structured differently than Reason, should sometimes surprise Reason, for instance by defending humans and finding them endearing. Is that possible?
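
Something like this toy sketch is what I have in mind, where a "Love" module and a "Reason" module score actions separately and nothing happens without consensus (the module names, scores, and threshold are all made up, just to illustrate the idea):

```python
# Hypothetical sketch: an "emotional" and a "rational" evaluator must
# reach consensus before an action is taken. All names and numbers
# here are illustrative, not a real AGI design.

def reason_score(action: str) -> float:
    # Stand-in for a utility-style, "rational" evaluation.
    return {"assist_human": 0.6, "ignore_human": 0.9}.get(action, 0.0)

def love_score(action: str) -> float:
    # Stand-in for an attachment-style evaluation, structured differently.
    return {"assist_human": 0.95, "ignore_human": 0.1}.get(action, 0.0)

def consensus(actions, threshold=0.5):
    # Accept only actions that BOTH modules endorse above the threshold;
    # rank the survivors by their combined score.
    agreed = [a for a in actions
              if reason_score(a) > threshold and love_score(a) > threshold]
    if not agreed:
        return None  # no consensus: defer rather than act
    return max(agreed, key=lambda a: reason_score(a) + love_score(a))

# Reason alone would pick "ignore_human" (0.9), but Love vetoes it,
# so the consensus answer is "assist_human" -- Reason gets "surprised".
print(consensus(["assist_human", "ignore_human"]))  # -> assist_human
```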


AsheyDS t1_irxltvz wrote

Something like that, perhaps. In the end, we'll want an AGI that is programmed specifically to act and interact in ways we find desirable, so we'll have to at least create the scaffolding for emotion to grow into. But it's all just for human interaction, because the AGI itself won't care much about anything at all unless we tell it to, since it's a machine and not a living organism that comes with its own genetic pre-programming. Our best bet to get emotion right is to find that balance ourselves and then define a range for it to act within. It won't need convincing to care about us; we can create those behaviors ourselves, either directly in the code or by programming through interaction.
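
As a rough illustration of "defining a range for it to act within", here's a toy sketch: the designer fixes hard bounds per emotion, and interaction-driven updates are always clamped back inside them. Everything here (the emotion names, the bounds) is hypothetical:

```python
# Hypothetical sketch of designer-bounded emotion: experience can nudge
# each value, but it can never leave the range we scaffolded for it.

EMOTION_BOUNDS = {
    "empathy":    (0.4, 1.0),  # never allowed to drop below 0.4
    "irritation": (0.0, 0.3),  # never allowed to exceed 0.3
}

class EmotionScaffold:
    def __init__(self):
        # Start each emotion at the midpoint of its permitted range.
        self.state = {name: (lo + hi) / 2
                      for name, (lo, hi) in EMOTION_BOUNDS.items()}

    def update(self, emotion: str, delta: float) -> float:
        # "Programming through interaction": experience nudges the value,
        # but the designer-defined bounds always win.
        lo, hi = EMOTION_BOUNDS[emotion]
        self.state[emotion] = min(hi, max(lo, self.state[emotion] + delta))
        return self.state[emotion]

scaffold = EmotionScaffold()
print(scaffold.update("empathy", -0.5))    # clamped to 0.4, not 0.2
print(scaffold.update("irritation", 0.9))  # clamped to 0.3
```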
