eigenlaplace t1_isa6z73 wrote
Reply to comment by Co0k1eGal3xy in [R] Mind's Eye: Grounded Language Model Reasoning through Simulation - Google Research 2022 by Singularian2501
It’s a simple question, no mention of air anywhere… The correct answer is they fall at the same rate.
Co0k1eGal3xy t1_isa7e7b wrote
I live on planet Earth, where most places have air. It is assumed that there is air unless stated otherwise.
eigenlaplace t1_isa7xzy wrote
I live on planet Question where most places have no air. Where is your god now?
Co0k1eGal3xy t1_isa8g7i wrote
>current language models (LMs) miss the grounded experience of humans in the real-world -- their failure to relate language to the physical world causes knowledge to be misrepresented and obvious mistakes in their reasoning.
That is my whole point. This paper is trying to avoid "planet Question" and make language models work in the real world instead.
I'm not interested in arguing over this. The paper is good, it just needs a minor correction in a future revision.
AskMoreQuestionsOk t1_isb66lb wrote
Actually, I think you make a good point. To understand conversations, stories, and problems like this, you need an internal model of what you are talking about before you can even begin to make an accurate prediction of the next state. We make an incredible number of assumptions from our own experience when we build those internal models. How do we know if air friction is important to this problem?
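For what it's worth, a few lines of simulation make the ambiguity concrete. This is a minimal sketch of my own (not from the paper); the masses, drag coefficient, sphere size, and drop height are all assumed values, and the point is just that the "same rate" answer depends entirely on whether you include air drag:

```python
# Minimal sketch: fall times for two spheres of equal size but different mass,
# integrated with explicit Euler, with and without quadratic air drag.
# All parameters below (masses, drag coefficient, radius, height) are assumptions
# chosen for illustration only.

def fall_time(mass, height=10.0, drag=False, dt=1e-4):
    """Return the time (s) for a sphere to fall `height` metres."""
    g = 9.81                       # gravitational acceleration, m/s^2
    rho = 1.225                    # air density at sea level, kg/m^3
    cd = 0.47                      # drag coefficient of a sphere (assumed)
    radius = 0.05                  # 5 cm sphere (assumed)
    area = 3.14159 * radius ** 2   # cross-sectional area, m^2
    y, v, t = 0.0, 0.0, 0.0
    while y < height:
        a = g
        if drag:
            a -= 0.5 * rho * cd * area * v * v / mass  # quadratic air drag
        v += a * dt
        y += v * dt
        t += dt
    return t

for m in (0.1, 1.0):  # a light sphere and a heavy sphere, kg
    print(f"mass {m} kg: vacuum {fall_time(m):.3f} s, "
          f"with air {fall_time(m, drag=True):.3f} s")
```

In vacuum both spheres land at the same time; with drag enabled the lighter one lags noticeably, which is exactly the assumption the two answers above disagree on.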