Submitted by timscarfe t3_yq06d5 in MachineLearning
red75prime t1_ivoiy3c wrote
Reply to comment by geneing in [D] What does it mean for an AI to understand? (Chinese Room Argument) - MLST Video by timscarfe
- Sure. Take an ML translation model and its weights, and do all of the matrix multiplications and other operations by hand (see the sketch below).
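A minimal sketch of what "doing it by hand" amounts to, using hypothetical toy weights rather than a real translator: every step is a mechanical multiply-and-add a person in the room could carry out with pencil and paper.

```python
def matmul(A, x):
    # Multiply a weight matrix A (list of rows) by a vector x:
    # nothing but repeated multiply-and-add.
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def relu(v):
    # Elementwise max(0, u): another purely mechanical rule.
    return [max(0.0, u) for u in v]

# Hypothetical two-layer "translator"; the weights are just numbers on paper.
W1 = [[0.5, -0.2], [0.1, 0.9]]
W2 = [[1.0, 0.3], [-0.4, 0.8]]

def forward(x):
    # The whole "translation" is arithmetic; no insight is required
    # of whoever executes it.
    return matmul(W2, relu(matmul(W1, x)))

print(forward([1.0, 2.0]))  # -> [0.67, 1.48]
```

Nothing in that loop requires the executor to know what the numbers mean, which is exactly the point of the hand-computation version of the argument.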
Merastius t1_ivp2wc2 wrote
The deceptive part of the thought experiment is this: the simpler you make the 'rule following' sound, the more unbelievable (intuitively) it becomes that the room has any 'understanding', whatever that means to you. If instead you say that the person inside the room has to follow billions or trillions of instructions that are dynamic and depend on the state of the system and on what happened before (i.e. modelling the physics going on in our brain by hand, or modelling something like our modern LLMs by hand; see the sketch below), then it is no longer intuitively obvious that understanding isn't happening.
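A hedged sketch of that state-dependence, with a hypothetical rule table standing in for a real model: each step's output is conditioned on the entire history of the computation so far, the way an autoregressive LLM conditions each token on everything generated before it.

```python
def step(history):
    # Hypothetical dynamic rule: what gets written next depends on
    # the whole transcript so far (the system's state), not just on
    # the most recent input symbol.
    if len(history) % 2 == 0:
        return f"token{len(history)}"
    return history[-1].upper()

history = ["hello"]
for _ in range(4):
    history.append(step(history))  # every step reads all prior state
print(history)  # -> ['hello', 'HELLO', 'token2', 'TOKEN2', 'token4']
```

Scale that up to trillions of state-dependent steps and the "it's obviously just rule following" intuition gets much weaker.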