12max345 t1_jdx4sdm wrote
Reply to [D] GPT4 and coding problems by enryu42
I feel like LLMs have encoded something like the laws of a language in their latent space through text, and they respond accordingly. But merely following a law isn't what we call conscious; inanimate objects follow the laws of physics, for example, and that doesn't indicate intelligent behaviour.

After all, text is just a medium for representing our thoughts; it's the thoughts that matter, not the medium.

The concepts of causality, fundamental reality, and decision making involve much more than following the laws of a language, which are just a means.

These LLMs can't question you unless you explicitly ask them to, and they can't interject. Knowledge was never consciousness; it's these abilities that compose consciousness.

I don't know how much sense I'm making to others, or maybe I'm at a loss for good words. In a nutshell: any model that fundamentally predicts tokens based on the weighting of previous tokens can never achieve consciousness.
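For what it's worth, here's a minimal toy sketch of the loop being described (a bigram counter, nothing like a real transformer; the corpus and names are made up for illustration). The model only maps previous tokens to a weighted distribution over the next token, samples from it, and repeats:

```python
# Toy illustration of "predicting tokens based on the weighting of
# previous tokens" -- NOT any real LLM's code, just the same interface:
# context in, weighted next-token distribution out.
import random
from collections import defaultdict

corpus = "the cat sat on the mat the dog sat on the rug".split()

# "Training": count how often each token follows each context token.
counts = defaultdict(lambda: defaultdict(int))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_token(context):
    """Pick the next token, weighted by how often it followed the last one."""
    followers = counts[context[-1]]
    if not followers:  # dead end: token never seen mid-corpus
        return random.choice(corpus)
    tokens, weights = zip(*followers.items())
    return random.choices(tokens, weights=weights)[0]

# Autoregressive generation: each step conditions only on prior tokens.
tokens = ["the"]
for _ in range(8):
    tokens.append(next_token(tokens))
print(" ".join(tokens))
```

A real LLM replaces the count table with a learned network over a huge context, but the generation loop is structurally the same: weighted next-token prediction, one step at a time.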