Submitted by phloydde t3_1125s79 in singularity
expelten t1_j8jkwht wrote
Reply to comment by Some-Box-8768 in Speaking with the Dead by phloydde
AI won't be like humans unless we force it to act that way; think of it more as a kind of alien intelligence. We could, for example, create a superintelligence that is relentless in pursuing the most stupid and dullest task. In my opinion there is no such thing as pure free will: we act the way we do because nature made us this way. The same goes for AI; if it acts a certain way, it's because we made it that way. A good preview of this future is character.ai.
phloydde OP t1_j8l7l23 wrote
There is a great sci-fi series, Psion, where the protagonist sees the corporate AIs as a totally distinct “life form” whose cognition is beyond comprehension.
Some-Box-8768 t1_j9fy5pu wrote
I tend toward thinking a 'true' AI will evolve away from our control. Otherwise, it's just a cleverly coded algorithm that gets called an AI because most people don't understand it, or because it can pull off one or two surprising tricks. That's equivalent to a well-trained pet, yet still not as truly intelligent as a pet.
Humans might not be smart enough to identify true intelligence. We can't even identify intelligence in living creatures. Think of our long history of moving the goalposts: "Only humans are intelligent because only we use tools." "OK. Well. Only humans make tools." "Um, well, only humans ...." "Well, only a human can pass the Turing Test!" So maybe that test isn't as definitive as people once thought it was.
Reminds me of that video where one monkey gets shocked by a live train wire, and another monkey gives him the equivalent of monkey CPR and brings the first one back to life! Or maybe he's just taking advantage of a moment when he can beat a rival with no risk of immediate repercussions to himself. Either way, pretty darned smart.