Submitted by blabboy t3_11ffg1u in MachineLearning
rpnewc t1_jajt66i wrote
Clearly it's computation of some form that's going on in our brains too. So sentience needs to be better defined by where it falls on a spectrum, with a simple calculator on one end and the human brain on the other. My personal take is that it sits much closer to the human brain than to LLMs. Even if we build a perfectly reasoning machine that solves generic problems like humans do, I still wouldn't consider it human-like until it raises purely irrational emotions like, "why am I not getting any girlfriends, what's wrong with me?" There is no reason for anyone to build that into a machine. Most of the humanness lies in the non-brilliant part of the brain.
What-Fries-Beneath t1_jak1uri wrote
Emotion isn't necessary for consciousness. It's necessary for humanness.
Nearly everyone ITT is holding humans up as the standard. I think it's because we're all afraid to really consider that we're fancy meat robots.
rpnewc t1_jak3n45 wrote
How would you define consciousness then? Just self-reflection?
What-Fries-Beneath t1_jak53gl wrote
I'm not a researcher in the space, just a big fan. That there are levels of consciousness is very well evidenced. Essentially, each level is a layer of dynamic awareness. One of those layers is an awareness of self, and of self in the world. It's the HOW that's under investigation, not so much the "what". https://www.cambridge.org/core/journals/behavioral-and-brain-sciences/article/homing-in-on-consciousness-in-the-nervous-system-an-actionbased-synthesis/2483CA8F40A087A0A7AAABD40E0D89B2
People like to muddy the question with philosophy and spirituality.