Submitted by blabboy t3_11ffg1u in MachineLearning
currentscurrents t1_janr9qo wrote
Reply to comment by lifesthateasy in [D] Blake Lemoine: I Worked on Google's AI. My Fears Are Coming True. by blabboy
>"Sentience is the capacity to experience feelings and sensations". Scientists use this to study sentience in animals for example (not in rocks, because THEY HAVE NONE).
How do you know whether or not something experiences feelings and sensations? These are internal experiences. I can build a neural network that reacts to damage as if it is in pain, and with today's technology it could be extremely convincing. Or a locked-in human might experience sensations, even though we wouldn't be able to tell from the outside.
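The mimicry point is easy to make concrete. Here's a minimal, hypothetical sketch (plain Python, all names and thresholds invented for illustration): a trivial agent that produces convincing pain-like behavior from a damage signal while having nothing resembling inner experience.

```python
# A trivial "pain-mimicking" agent: maps a scalar damage signal to
# avoidance behavior and distress output. Nothing here experiences
# anything; it's pure input-output behavior.

def pain_response(damage: float, threshold: float = 0.5) -> dict:
    """Return behavioral outputs for a damage signal in [0, 1]."""
    hurting = damage > threshold
    return {
        "withdraw": hurting,                     # reflex-like avoidance
        "vocalize": "ouch!" if hurting else "",  # distress signaling
        "avoid_next_time": hurting,              # aversion flag
    }

print(pain_response(0.9))  # behaves "as if" in pain
print(pain_response(0.1))  # no response
```

From the outside, this satisfies the same behavioral proxies (withdrawal, vocalization, avoidance learning) that sentience studies measure; the point is that the proxies don't distinguish it from something that actually feels.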
Your metastudy backs me up. Nobody's actually studying animal sentience (because it is impossible to study); all the studies are about proxies like pain response or intelligence and they simply assume these are indicators of sentience.
>What we found surprised us; very little is actually being explored. A lot of these traits and emotions are in fact already being accepted and utilised in the scientific literature. Indeed, 99.34% of the studies we recorded assumed these sentience related keywords in a number of species.
Here's some reading for you:
https://en.wikipedia.org/wiki/Hard_problem_of_consciousness
https://en.wikipedia.org/wiki/Mind%E2%80%93body_problem
People much, much smarter than either of us have been flinging themselves at this problem for a very long time with no progress, and without even an idea of how progress might be made.
lifesthateasy t1_janudsp wrote
So you want to debate my comment on sentience, and you try to prove your point by linking a wiki article about consciousness?
Ah, I see you haven't gotten past the abstract. Let me point you to some of the more interesting points: "Despite being subject to debate, descriptions of animal sentience, albeit in various forms, exist throughout the scientific literature. In fact, many experiments rely upon their animal subjects being sentient. Analgesia studies for example, require animal models to feel pain, and animal models of schizophrenia are tested for a range of emotions such as fear and anxiety. Furthermore, there is a wealth of scientific studies, laws and policies which look to minimise suffering in the very animals whose sentience is so often questioned."
So your base idea of questioning sentience just because it's subjective is a paradox that can be resolved in one of two ways: either you accept sentience and keep studying it, or you say it can't be proven, in which case you can throw psychology out the window, too. By your logic you can't prove to me that you exist, and if you can't even prove such a thing, why do science at all? We don't assume pain etc. are proxies for sentience; we made up a definition of sentience to describe this phenomenon we all experience. "You can't prove something that we all feel, and thus made up a name for, because we can only feel it" kinda makes no sense. We even have specific criteria for it: https://www.animal-ethics.org/criteria-for-recognizing-sentience/