neuralbeans t1_jdmarno wrote
Reply to comment by OriginalCompetitive in What happens if it turns out that being human is not that difficult to duplicate in a machine? What if we're just ... well ... copyable? by RamaSchneider
What does that mean?
OriginalCompetitive t1_jdmnbh6 wrote
You said you couldn’t think of any reason why we would be different from a complex computer. One possible reason is that we’re conscious and it’s possible complex computers will not be.
We don’t know what causes consciousness, but there’s no reason to think intelligence has anything to do with consciousness.
neuralbeans t1_jdmo56v wrote
No, I mean: what is consciousness?
OriginalCompetitive t1_jdnc1dl wrote
The fact that you ask me this makes me suspect that maybe you aren’t conscious.
neuralbeans t1_jdnscjk wrote
Would you be able to tell if I wasn't?
OriginalCompetitive t1_jdnysrr wrote
I’ll save you some time. I can’t define it, I can’t test for it, and I can’t even be sure whether I was conscious in the past or am simply inserting a false memory of having been conscious when I actually wasn’t.
I feel like I can be sure that I’m conscious at this precise moment, though, and I think it’s a reasonable guess that I was conscious yesterday as well, and probably a reasonable guess that most other people have some sort of conscious experience. For that reason I try not to impose needless suffering on other people, even though I can’t be sure that they truly experience conscious suffering.
I think it’s possible that complex computers will never experience consciousness, and if I’m right, that would be a reason why we would be different from a complex computer.