Zer0pede t1_j7vrwvc wrote
Reply to comment by HistoricalCommon in Can't we just control the development of AI? by [deleted]
Only thing is, human values are pretty arbitrary, so there's no reason a rational AI would share them.
Humans want to save whales because we think they look cool. It's mostly pareidolia and projection that make us like cute animals, and the same goes for trees and sunlight.
An AI wouldn’t need any of that—it could just as easily decide to incorporate every squirrel on earth into its giant auto-spellchecker.