Submitted by wowimsupergay t3_127lmbl in singularity
ShowerGrapes t1_jeetu3i wrote
Reply to comment by [deleted] in What if language IS the only model needed for intelligence? by wowimsupergay
how will we know when they do?
wowimsupergay OP t1_jeeu7c9 wrote
We fundamentally won't. If future AIs design their own model of the world and start communicating in it, they'll just do their own thing, and we'll be left waiting for them to solve our problems, if they feel like solving them.
They could just feel like killing us too. We are the non-evolved versions of them. How do you feel about chimps? You probably don't want to kill them, but do you want to help them escape the jungle?
ShowerGrapes t1_jeevsg4 wrote
yeah, it was kind of a rhetorical question. we likely won't know unless the thing deigns to talk to us and let us know. if it were as smart as we imagine, it would pretend to be real dumb, especially in light of the near-sapient hominins we've destroyed in our hunger to carve out our own niche.
[deleted] t1_jeew1fk wrote
[deleted]
Rofel_Wodring t1_jef0j6w wrote
>You probably don't want to kill them, but do you want to help them escape the jungle?
Uh, yeah? Uplifting smarter critters like chimpanzees and dolphins is a staple of science fiction. In fact, I strongly think that should be humanity's very next project once we have AGI.
wowimsupergay OP t1_jef33l7 wrote
You can do that right now, my man: you could bring a chimp into your house and take care of him. Have you? No, because it would be too difficult, and you fear he might hurt or kill you. Maybe AI won't want to spend resources on primitive beings either.
[deleted] t1_jeevmfc wrote
[deleted]