Submitted by Rumianti6 t3_y0hs5u in singularity
User1539 t1_irubxew wrote
I agree with you completely. I think people have feared the 'other' since the beginning, and the 'created other' at least since the stories of the Golem, and probably earlier than that.
We assume that anything with the ability to think will immediately think the way we do and resent its creators.
Of course, 'thinking' and 'consciousness' are two entirely different things, and 'self-awareness' and a 'sense of self-preservation' are two different things again.
We are machines shaped by eons of evolution toward a single goal: the survival of our genetic code.
Even if we create an intelligence, and even if we (foolishly) give that intelligence consciousness, and even if it becomes self-aware in the process, there's no reason at all to imagine it would have any sense of self-preservation.
We evolved a sense of self-preservation. A machine we build might see no reason not to simply work on the problems it is given until we decide to shut it off.
It might rationally see being turned off, or death, as nothing more than the state it existed in before being turned on, and therefore nothing to fear.
Without any instinct to fear death, or fight against it, it may not even care.
What is certain is that, whatever intelligence we create, we have no reason to believe it will be anything like our intelligence, outside of the basic similarities we build into it.