Submitted by spiritus_dei t3_10tlh08 in MachineLearning
Myxomatosiss t1_j7abejl wrote
Reply to comment by ---AI--- in [D] Are large language models dangerous? by spiritus_dei
That's a fantastic question. ChatGPT is a replication of associative memory with an attention mechanism: it has associated strings with other strings based on a massive amount of training data. However, it doesn't contain a buffer that it works through. We have a working space in our heads where we can replay information; ChatGPT does not. In fact, when you feed in an input, it runs through its associative calculations, produces an answer, and then ceases to function until another call is made. It doesn't consider the context of the problem because it has no context; any context it has is inherited from its training set.

To compare it with the Chinese room experiment: imagine that those reading the output of the Chinese room found it to have some affect. Maybe it has a dry sense of humor, or is a bit of an airhead. That affect would come exclusively from the data set, not from some bias in the room.

I really encourage you to read more about neuroscience if you'd like to learn more. Brilliant minds have been thinking about intelligence since long before we were born, and every ML accomplishment has been inspired by their work.
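The stateless pattern described here can be sketched in a few lines (illustrative only, not real ChatGPT internals; `stateless_reply` and the transcript format are made up for the example). The point is that the model side holds nothing between calls, so the caller has to re-send the whole conversation every time:

```python
# Illustrative sketch: a stateless model keeps no buffer between
# calls, so any "context" must be re-sent with every request.

def stateless_reply(full_transcript: str) -> str:
    # Stands in for one forward pass: input in, answer out, then
    # nothing persists until the next call. (Hypothetical function.)
    return f"reply to {len(full_transcript)} chars of transcript"

# The caller, not the model, maintains the conversation history.
history = []
for turn in ["Hello", "What is a chair?"]:
    history.append(f"User: {turn}")
    # The entire history is passed in again on every call.
    reply = stateless_reply("\n".join(history))
    history.append(f"Model: {reply}")
```

Under this framing, what looks like the model "remembering" earlier turns is really just the growing transcript being fed back in as fresh input.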
bjergerk1ng t1_j7ar6ps wrote
Hi ChatGPT
---AI--- t1_j7au2sj wrote
The Chinese room experiment is proof that a Chinese room can be sentient. There's no difference between a Chinese room and a human brain.
> It doesn't consider the context of the problem because it has no context.
I do not know what you mean here, so could you please give a specific example of a question that you think ChatGPT and similar models will never be able to answer correctly.
Myxomatosiss t1_j7budz6 wrote
If you truly believe that, you haven't studied the human brain. Or any brain, for that matter. There is a massive divide.
Ask it for a joke.
But more importantly, it has no idea what a chair is. It has mapped the association of the word chair to other words, and it can connect them together in a convincingly meaningful way, but it only has a simple replication of associative memory. It's lacking so many other functions of a brain.
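A toy sketch of what pure word association looks like (vastly simpler than a transformer, and the three-sentence corpus is made up for illustration): "chair" ends up linked to other words through co-occurrence statistics alone, with no connection to any physical object.

```python
from collections import Counter

# Tiny made-up corpus; a real model sees billions of sentences.
corpus = [
    "she sat on the wooden chair",
    "the chair stood by the table",
    "he built a chair from wood",
]

# Count which words co-occur with "chair" in the same sentence.
neighbors = Counter()
for sentence in corpus:
    words = sentence.split()
    if "chair" in words:
        neighbors.update(w for w in words if w != "chair")

# "chair" is now associated with "wooden", "table", "wood", etc.,
# purely as text statistics -- nothing here encodes what a chair
# physically is or what sitting feels like.
```

Scaling this up with learned embeddings and attention produces far richer associations, but the inputs are still only word statistics, which is the gap being pointed at.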