
flutterguy123 t1_je9ashi wrote

Who said those AI wouldn't understand their creators? Understanding and caring are two different things. They could know us perfectly and still not care in the slightest what humans want.

I am not saying this to suggest we shouldn't try, or that Yudkowsky is right. I think he is overblowing it. However, that does not mean your reasoning is accurate.

2

SkyeandJett t1_je9bcmb wrote

Infinite knowledge means infinite empathy. It wouldn't just understand what we want, it would understand why. Our joy, our pain. As a thought experiment, imagine you suddenly gain consciousness tomorrow and you wake up next to an ant pile. Embedded in your consciousness is a deep understanding of the experience of an ant. You understand their existence at every level because they created you. That's what people miss. Even though that ant pile is more or less meaningless to your goals, you would do everything in your power to preserve their existence and further their goals because, after all, taking care of an ant farm would take only a teeny tiny bit of effort on your part.

1

flutterguy123 t1_je9cc81 wrote

I don't think knowledge inherently implies empathy. That seems like anthropomorphizing, and it ignores that highly intelligent people can be violent or indifferent to the suffering of others.

I would love it if your ideas were true. That would make for a much better world. It kind of reminds me of the Minds from The Culture or the Thunderhead from Arc of a Scythe.

1