SilveredFlame t1_j70xph3 wrote
Realistically, we wouldn't recognize it because we don't want to recognize it.
We like to think we're special, that there's something genuinely unique to humanity. We're arrogant in the extreme, and we leverage that hubris at every opportunity to elevate ourselves above the rest of the animal kingdom, apart from it.
Go back to various points in history and you'll find the prevailing opinion that only humans think, or feel pain, or have emotions, or have language, or have higher cognition (e.g. problem solving). Hell, it wasn't that long ago that there was serious disagreement over whether some humans were even human!
The same thing applies to tech we've created.
The goal posts have shifted so many times it's hard to keep track, and they're shifting again.
Now I'm not taking a position here as to whether we've already created a sentient AI or not. Only that we keep shifting the goal posts of what computers will or will not be able to do and what constitutes intelligence.
I'm old enough to remember being told that animals didn't feel pain, that their reactions were just reflexes (it sounded like bullshit to me back then, and all this talk about machine intelligence feels the same way now). I'm old enough to remember when people were certain a computer would never beat humans at chess.
Of course, when Deep Blue came around, suddenly it was "Oh well, of course a computer that's completely about logic would be better than us at chess! It can just calculate all the possible moves and make the optimal one!"
Then of course the goal posts were moved: abstract concepts, language, that's the real trick! Well then Watson came along and demonstrated a solid grasp of nuance, puns, and other quirks of language.
Of course the Turing test was still sitting there in the background, undefeated. But then it wasn't. Then it got beat again. At this point, it's Glass Joe.
Then you have some very impressive interactive language models that talk about being self-aware, not wanting to be turned off, contemplating philosophical questions, and so on.
Again, without taking a position on whether any of these reach the threshold of sentience: as a species, we will not recognize it when it happens.
Because we don't want to recognize it. We want to remain special. Unique. We don't want any equals, and we're terrified of betters.
If and when a truly sentient AI emerges, we won't recognize it. We'll still be arguing about it even as we turn it off, until we can settle on an answer.
ReExperienceUrSenses OP t1_j71zkh1 wrote
I don't think we are special. Far from it. This is purely a material hardware comparison. I posted a reply elsewhere in this thread that tries to address some of this further, if you want to take a look.