Submitted by Foundation12a t3_zcyo7v in singularity
PolymorphismPrince t1_iz1w31t wrote
Reply to comment by hducug in What are your predictions for 2023? How did your predictions for 2022 turn out? by Foundation12a
I actually don't think you understand large language models very well. The human brain is, structurally, close to isomorphic to a stimulus-prediction model if you think about it, and basically every stimulus can be encoded in text.
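(For context, a minimal sketch of what "prediction" means here in practice: a language model just assigns probabilities to the next token given the text so far. This assumes the Hugging Face transformers library and the gpt2 checkpoint, neither of which the thread mentions; it's only an illustration of next-token prediction, not of GPT-4 itself.)

```python
# Illustration: next-token prediction with a small pretrained language model.
# Assumes `transformers` and `torch` are installed and the "gpt2" weights are available.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # The model outputs a score (logit) for every possible next token.
    logits = model(**inputs).logits

# Take the highest-scoring token at the last position: the model's prediction
# for what comes next in the stimulus (here, the text).
next_token_id = logits[0, -1].argmax()
print(tokenizer.decode(next_token_id))
```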
hducug t1_iz22ggk wrote
The human brain has something called logic, which language models don't have. Logic is what intelligence is all about. It doesn't matter that prediction models work the same way as our brain; that has nothing to do with GPT-4 being intelligent.