AUFunmacy OP t1_j6nss96 wrote
Reply to comment by KishCom in The Conscious AI Conundrum: Exploring the Possibility of Artificial Self-Awareness by AUFunmacy
I understand the response, as I have experience programming neural networks. You mean that the AI we have runs on software and may superficially resemble a model of neuronal activity, but that physically, at the hardware and compilation level, it is very different. In essence, though, it still represents steps of thought that navigate toward a likely solution - which is exactly what our brains do in that sense.
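To make that concrete, here is a toy sketch of what I mean by "steps of thought that navigate toward a likely solution." Everything in it (the layer sizes, the random weights, the example input) is invented purely for illustration; it is not any particular model:

```python
# Toy two-layer network: a forward pass is just repeated weighted
# sums and nonlinearities that funnel an input toward a likely output.
# All shapes and weights here are arbitrary, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # hidden layer
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)  # output layer

def forward(x):
    h = np.tanh(W1 @ x + b1)             # "step of thought" 1
    z = W2 @ h + b2                      # "step of thought" 2
    return np.exp(z) / np.exp(z).sum()   # softmax: a distribution over answers

# The network "navigates" this input toward its most likely answer.
print(forward(np.array([0.5, -1.0, 2.0])))
```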
I don't mean to say that AI will gain consciousness and suddenly be able to deviate from its programming, but somehow, just maybe, that complex neuronal activity conjures a subjective experience. Consider a single-celled organism 3.8 billion years ago, with no organs and no plausible mechanism for consciousness: it is easy to say that thing can't develop consciousness. Evolve that single cell into multi-cellular organisms and it still seems impossible, right up until you see a sign of intelligent thought and think to yourself, "when the hell did that begin?" No one knows the nature of consciousness; we have to stop pretending we do.
Let it be known I think a submarine would win the Olympics for swimming, and I also think you are naive to consider your consciousness anything more than a language model with some inbuilt sensory features.
KishCom t1_j6nuddm wrote
> I have experience in programming neural networks
Me too!
> just maybe, the complex brain activity conjures a subjective experience
That would be lovely. Conway's Game of Life, "simple rules give rise to complexity" and all that. I don't think there's enough flexibility in the current hardware that executes GNNs to allow this, though. The kind of deviation required would be seen as various kinds of errors or problems with the model.
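For anyone who hasn't played with it, the "simple rules" really are this small. A minimal sketch of one Game of Life step (NumPy, toroidal wrapping via np.roll, a blinker as the seed; all of it just for illustration):

```python
# One step of Conway's Game of Life: a couple of lines of rules, yet
# gliders, oscillators, and even Turing-complete machinery emerge.
import numpy as np

def step(grid):
    # Count each cell's 8 neighbours via shifted copies (wraps at the edges).
    n = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))
    # The entire rule set: birth on exactly 3 neighbours, survival on 2 or 3.
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

# A "blinker": three live cells that oscillate forever.
g = np.zeros((5, 5), dtype=int)
g[2, 1:4] = 1
for _ in range(3):
    print(g, "\n")
    g = step(g)
```

Complexity emerges, sure, but the substrate constrains what counts as a valid state; in a trained model, the equivalent "deviations" just register as errors.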
> I think a submarine would win the olympics for swimming
This is something a language model would come up with, as it makes about as much sense as inquiring about the marital status of the number 5.
> I also think you are naive to consider your consciousness anything more than a language model with some inbuilt sensory features.
I think you should meditate more, perhaps try some LSD. *What Is It Like to Be a Bat?*, anyway?
edit BTW: I hope I don't come off as arguing. I'd love to have a cup of coffee and a chat with you.