Submitted by blabboy t3_11ffg1u in MachineLearning
What-Fries-Beneath t1_jak44fb wrote
Reply to comment by RathSauce in [D] Blake Lemoine: I Worked on Google's AI. My Fears Are Coming True. by blabboy
>Because we can put a human in an environment with zero external visual and auditory stimuli
Do that for a few days, and that human will never recover full cognitive function. https://www.google.com/books/edition/Sensory_Deprivation/1tBZauKc4GUC
Anyway, completely aside from the particulars of this discussion: "identical to humans" isn't the bar.
>No LLM is capable of producing a signal lacking a very specific input; this fact does differentiate all animals from all LLMs.
That's because we're meat-based. Our neurons kill themselves without input; they stimulate each other nearly constantly to maintain connections, and some regions generate waves of activity to maintain, strengthen, and prune connections. Saying that electronic systems must show the same activity is like saying "Birds are alive. Bears can't fly, therefore bears are dead."
Consciousness is an internal representation of the world that incorporates an awareness of self: a dynamic computation of the self in the world. I wish people would stop saying "we don't have a definition of consciousness." There are open questions about exactly how it arises, but there are some extremely well-evidenced theories. My personal favorite is Action Based Consciousness.