CertainMiddle2382 t1_j6wup8c wrote

Context.

Hard physical problems are solved in a very controlled context; that context is often a “fiction” of reality deemed close enough to be accurate yet simple enough to be useful.

Even all “common” mathematics had to be declared to happen inside a red-taped safe space named ZFC; otherwise, the unrelenting waves of complexity outside it would have torn down everything we tried to build.
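To make the red tape concrete, here is one of the fence posts, the Axiom Schema of Separation: you may only carve a subset out of a set you already have. This is a rough sketch in standard notation, not anything specific to this thread:

```latex
% Axiom Schema of Separation: for any formula \varphi and any existing set A,
% the elements of A satisfying \varphi form a set B.
\forall A \; \exists B \; \forall x \, \bigl( x \in B \iff (x \in A \wedge \varphi(x)) \bigr)
% Unrestricted comprehension, \exists B \, \forall x \, (x \in B \iff \varphi(x)),
% would allow \varphi(x) := x \notin x and hence Russell's paradox;
% requiring x \in A is exactly the red tape that blocks it.
```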

Everything is about context.

“Perception” and “real life” happen in a much more complicated context. That context is not sandboxed, and it contains all the little sandboxes we built to make our thinking work.

To model those simple concepts, you practically need to have an internalized model of the whole world…

4

Iffykindofguy t1_j6x0cre wrote

No, you do not need an internalized model of the whole world; that's an '80s solution to '80s sci-fi paradoxes.

4

CertainMiddle2382 t1_j6xh585 wrote

I don’t get it. A DNN’s latent space is an internalized model of the world: a mapping of its invariants at increasing levels of abstraction.

We just don’t call it that…
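Here is a minimal sketch of what I mean (PyTorch; the architecture and dimensions are purely illustrative, not any specific system): training a decoder to reconstruct inputs from a low-dimensional code forces the encoder’s latent space to capture the structure, i.e. the invariants, of the data.

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Toy autoencoder: the bottleneck z is the 'internalized model'."""

    def __init__(self, input_dim: int = 784, latent_dim: int = 16):
        super().__init__()
        # Encoder: compresses the input into a 16-dimensional latent code.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decoder: reconstructs the input from the code; minimizing the
        # reconstruction error forces z to retain the data's invariants.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x: torch.Tensor):
        z = self.encoder(x)            # latent representation
        return self.decoder(z), z

model = Autoencoder()
x = torch.randn(8, 784)                # a batch of flattened inputs
reconstruction, z = model(x)
print(z.shape)                         # torch.Size([8, 16])
```

Each layer maps its input to a more abstract representation; stack enough of them on enough data and the latent space is, functionally, a compressed model of the world the data was drawn from.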

3

Iffykindofguy t1_j6xhjgw wrote

A machine built to operate on a construction site does not need an internalized model of the whole world, especially if that machine is able to access new information streams constantly.

2

CertainMiddle2382 t1_j6xkn6n wrote

A machine? An AI is not a simple machine; it is a machine that strives to build a model of the world and act according to it. It is not a simple excavator.

2

Iffykindofguy t1_j6xqkgq wrote

Humans are machines. An AI's physical body will be a machine. You're getting lost in the details. My point was simply that a lot of the blue-collar replacements you seem to think would need entire world simulations don't actually require that.

2