Barton5877 t1_jeadc3o wrote

On 2:

Competence is used sociologically to describe the ability to perform (to speak or act, say) in a manner that demonstrates some level of mastery, but it isn't necessarily a sign of understanding.

I'd be loath to have to design a metric or assessment by which to "measure" understanding. One can measure or rate competence; the degree to which the person "understands" what they are doing, why, how, and for what purpose is another matter.
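
To make the contrast concrete, a competence metric can be as simple as a task success rate. A minimal sketch follows; the tasks and the lookup-table "answerer" are hypothetical placeholders, not a real benchmark:

```python
# A minimal sketch of what "measuring competence" looks like in practice:
# we rate observable performance on tasks, not the understanding behind it.
# The tasks and the toy answer function below are hypothetical placeholders.
from typing import Callable

def competence_score(answer: Callable[[str], str],
                     tasks: list[tuple[str, str]]) -> float:
    """Fraction of tasks answered correctly: a rating of performance only."""
    correct = sum(1 for prompt, gold in tasks if answer(prompt) == gold)
    return correct / len(tasks)

tasks = [("2 + 2 = ?", "4"), ("Capital of France?", "Paris")]
lookup = {"2 + 2 = ?": "4", "Capital of France?": "Paris"}

# A pure lookup table scores 100% while "understanding" nothing,
# which is exactly the competence/understanding gap in question.
print(competence_score(lookup.get, tasks))  # 1.0
```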

In linguistics, there's also a distinction between practical and discursive reason that can be applied here: the ability to reason vs. the ability to describe one's reasoning. Again, understanding escapes measurement, insofar as what we do and how we know what we are doing isn't the same as describing it (which requires both reflection on our actions and translation into speech that communicates them accurately).

The long and short of it is that "understanding" is never going to be the right term for us to use.

That said, there should be terminology for describing the conceptual connectedness that LLMs display. Some of this is in the models and design. Some of it is in our projection and psychological interpretation of their communication and actions.

I don't know to what degree LLMs have "latent" conceptual connectedness, or whether it emerges only in response to prompts.
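
One crude way to probe for the latent kind, independent of any prompting, is to compare a model's embeddings directly. A sketch, using sentence-transformers as a stand-in (GPT-4's internals aren't publicly inspectable, and the model name and word list here are just illustrative):

```python
# A rough probe of "latent" conceptual connectedness: compare embeddings
# directly, with no prompt exchange involved. sentence-transformers is a
# stand-in here; the model name and word list are illustrative choices.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

words = ["dog", "puppy", "wolf", "spreadsheet"]
embs = dict(zip(words, model.encode(words)))

# If the connectedness is "in the model" rather than only in the prompted
# exchange, related concepts should sit measurably closer together.
print(cosine(embs["dog"], embs["puppy"]))        # expect relatively high
print(cosine(embs["dog"], embs["spreadsheet"]))  # expect relatively low
```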

pengo t1_jechdk0 wrote

> The long and short of it is that "understanding" is never going to be the right term for us to use.

Yet I'm still going to say, "Wow, ChatGPT really understands the nuances of regex XML parsing," and also say, "ChatGPT has no understanding of anything at all," and leave it to the listener to interpret each sentence correctly.

> I don't know to what degree LLMs have "latent" conceptual connectedness, or whether it emerges only in response to prompts.

concept, n.

  1. An abstract and general idea; an abstraction.

  2. Understanding retained in the mind, from experience, reasoning, and imagination.

It's easy to avoid "understanding" for being imprecise, but it's impossible to pick other words that don't have the exact same problem.

Barton5877 t1_jee1fw4 wrote

That the definition of concept you're citing uses the term "understanding" is incidental: it's clearly a definition of concept in the context of human reasoning.

Whatever terminology we ultimately use for the connectedness of neural networks pre-trained on language is fine by me. It should be as precise to the technology as possible whilst still conveying the effects of "intelligence" that are appropriate.

We're at the point now where GPT-4 seems to produce connections that come from a place that's difficult to find or reverse-engineer, or that perhaps simply come from surprising token selections.
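
On that second possibility, here's a toy sketch of how temperature in the sampling step makes low-probability, "surprising" token selections more common. The vocabulary and logits are invented for the example:

```python
# A toy illustration of how sampling can yield "surprising" token selections.
# The vocabulary and logits are invented; real models do the same thing over
# tens of thousands of tokens at each step.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "a", "quantum", "banana"]
logits = np.array([4.0, 3.5, 1.0, 0.2])  # the model's raw preferences

def sample(temperature: float) -> str:
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())  # numerically stable softmax
    probs /= probs.sum()
    return vocab[rng.choice(len(vocab), p=probs)]

# At low temperature the top token dominates; at higher temperatures,
# low-probability (i.e. surprising) tokens get selected far more often.
for t in (0.2, 1.0, 2.0):
    picks = [sample(t) for _ in range(1000)]
    print(t, {w: picks.count(w) for w in vocab})
```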

That's what I take away from a lot of the discussion at the moment. I have no personal insight into the model's design, or the many parts that are stitched together to make it work as it does (quoting Altman here, talking to Lex Fridman).
