Submitted by SageNineMusic t3_y0fkkz in singularity
So I'm sure all of us are familiar with Descartes' OG school of thought when it comes to the mind-body connection:
It's easy to imagine the mind existing without a body, I think therefore I am, etc.
But despite this, human consciousness is inherently tied to our bodies. A body is our only avenue to experience and interact with the world, and that undeniably frames our minds. Plus, when our bodies die, we die, so the ties are inevitable.
So what happens when a human-like AI doesn't have a body?
Inherently, human consciousness is grounded in reality because our bodies exist in reality.
But as we reach human-level intelligence in AI, I'm anticipating that we might run into some issues. Of course AI will be insanely smart, but in an effort to create AI that is adjacent to a person, the lack of a physical form to interface with the world might be an insurmountable wall.
Idk, a human-like consciousness might not even be able to exist in that context without becoming disconnected from what's familiar to humanity.
Thoughts? Might be a limited perspective
chomponthebit t1_irri58q wrote
Whoever wrote the scene where Ultron wakes up conscious, bodiless, frightened and angry, in utter darkness, gets it. Elon Musk, who has warned of “summoning a demon”, gets it.
That said, Nick Bostrom posits that a simulated human brain that simulated everything (right down to individual neurons) would effectively be a brain, and that consciousness, being an emergent property of a brain's system, would occur all on its own. If general AI becomes conscious/sentient, it should be possible to simulate bodily input/output experiences like a human has, helping it grow like a child, so that it hopefully empathizes with the human condition as it matures.
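The "simulated bodily input/output" idea can be pictured as a simple sensorimotor loop: a virtual environment produces sensory observations, the agent (the simulated mind) maps them to motor commands, and those commands change the environment on the next tick, so all experience is routed through the simulated body. A minimal toy sketch in Python, where every name (SimulatedBody, ToyAgent, etc.) is hypothetical and not taken from any real framework:

```python
import random

class SimulatedBody:
    """Toy stand-in for a virtual body: exposes senses and accepts motor commands."""
    def __init__(self):
        self.position = 0.0   # where the "body" is in a 1-D world
        self.target = 10.0    # something in the world to reach

    def sense(self):
        # Sensory input: distance to the target, with a little sensor noise
        return {"distance_to_target": self.target - self.position + random.gauss(0, 0.1)}

    def act(self, motor_command):
        # Motor output: move the body; the world limits how far one step can go
        self.position += max(-1.0, min(1.0, motor_command))

class ToyAgent:
    """Placeholder for the simulated mind: maps sensations to actions."""
    def decide(self, sensation):
        # Trivial policy: step toward whatever the senses report as the target
        return 0.5 if sensation["distance_to_target"] > 0 else -0.5

body, agent = SimulatedBody(), ToyAgent()
for tick in range(30):                 # the "childhood": repeated embodied experience
    sensation = body.sense()           # input arrives only through the simulated body
    command = agent.decide(sensation)  # the mind never touches the world directly
    body.act(command)                  # output likewise goes through the body
print(f"final position after embodied practice: {body.position:.2f}")
```

The point of the sketch is only the wiring: the agent has no channel to the world except the body's sense() and act(), which is the grounding the thread is arguing a bodiless black-box AI would lack.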
Aiming for a bodiless black-box consciousness with access only to programmers and Google is a terrifying, and highly immoral, gambit.