Submitted by hackinthebochs t3_zhysa4 in philosophy
Opus-the-Penguin t1_izorju2 wrote
Interesting. How would we test for it? Is there a way of telling the difference between something that has sentience and something that mimics it?
hackinthebochs OP t1_izp7r6g wrote
We can always imagine the behavioral/functional phenomena occurring without any corresponding phenomenal consciousness. So this question can never be settled by experiment. But we can develop a theory of consciousness and observe how well the system in question exhibits the features our theory says correspond with consciousness. Barring any specific theory, we can ask in what ways the system is similar to and different from systems we know are conscious, and whether those similarities or differences bear on the credibility of attributing consciousness to the system.
Theory is all well and good, but in the end it will have little practical significance. People tend to be quick to attribute intention or minds to inanimate objects or random occurrences. Eventually the behavior of these systems will be so similar to that of humans that most people's sentience-attribution machinery will fire, and we'll be forced to confront all the moral questions we have been putting off.
Opus-the-Penguin t1_izp9qo8 wrote
Nice succinct statement of the issue. There's a lot boiled down into those two paragraphs. Thank you.
electriceeeeeeeeeel t1_izw17ex wrote
Nope, no difference. That's why we'll come to accept them at face value as the same, though many will hold the underlying value assumption that they're different because their parts are different. Still, others won't lean on that value assumption, and given the lack of strong evidence, the belief that they aren't sentient will likely erode over time.