Submitted by [deleted] t3_106ixgv in singularity
KSRandom195 t1_j3gvo8c wrote
ChatGPT doesn’t understand how code works.
It can’t actually solve problems, only answer prompts with solutions it’s already seen before.
NikoKun t1_j3hm4ym wrote
There does appear to be some level of understanding and problem-solving emerging as more than the sum of its knowledge, and that goes well beyond merely answering with solutions it's already seen. I can assure you, I've asked it to help me with some very obscure coding problems I'd been stuck on for a while, and I think, thanks to its short-term memory, it figured out a solution I never would have. All it took was a little back and forth to give it enough context, and it worked out a solution that really couldn't exist anywhere else.
KSRandom195 t1_j3hspcp wrote
Appear, but not actually. It is an LLM, which has no understanding of its content.
Unless you think it somehow spontaneously developed consciousness, it’s not quite conscious yet.
blueSGL t1_j3hizep wrote
>only answer prompts with solutions it’s already seen before.
citation needed.
KSRandom195 t1_j3hssmn wrote
This is fundamental to how LLMs work. They don’t generate new knowledge.
blueSGL t1_j3ibqqs wrote
Well, if that's the case, plagiarism detectors should have no problem identifying the output.
Or maybe it's the case that, by being trained on so much data, the model learns the underlying structure of how that data is formed.
That would explain the emergent abilities.
KSRandom195 t1_j3iikl6 wrote
Or people are grasping at straws to try to explain a mechanism they don’t understand.
blueSGL t1_j3ijeyf wrote
>It can’t actually solve problems, only answer prompts with solutions it’s already seen before.
.
>people are grasping at straws to try to explain a mechanism they don’t understand.
You are making definitive statements about things you yourself say experts in the field 'don't understand'.
Either you are claiming you know more than they do, or you are professing your own ignorance of the matter.
Which is it?
KSRandom195 t1_j3ip49m wrote
Experts in the field aren't claiming it's generating new knowledge. They're saying that as you extend the size of the model, interesting stuff happens. Roughly, it seems they're saying it performs better.
blueSGL t1_j3ir8t3 wrote
Read the paper. It's not just that it performs better; it's that abilities that start out no better than random suddenly hit a phase change and become measurably better.
You were initially saying:
> only answer prompts with solutions it’s already seen before.
Let's look at an example that makes things crystal clear.
Image generators, by combining concepts, can come up with brand-new images. Does the model have to have seen dogs before in order to place one in the image? Yes. Does it need to have seen one that looks identical to the final dog, i.e. could you crop the image, reverse-image-search it, and get a match? No.
The same is true with poems, summaries, code, etc. It's finding patterns and creating outputs that match the requested pattern. So, to get back to the point about coding, it could very well output code it's never seen before, by ingesting enough of it to learn the syntax.
It's seen dogs before; it outputs similar but unique dogs. It's seen code before; it outputs similar but unique code.
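A toy sketch of that idea (this is a bigram Markov chain, nothing like a transformer, just an analogy for "extract patterns, then recombine them"): every local pattern in the output was seen in training, yet the output as a whole can be a sequence that appears nowhere in the training data verbatim. The corpus and names here are made up for illustration.

```python
import random

# Tiny made-up training corpus.
corpus = [
    "the dog runs in the park",
    "the cat sleeps in the sun",
    "a dog sleeps in the park",
]

# Learn the local pattern: which words can follow each word.
follows = {}
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows.setdefault(a, []).append(b)

def generate(start, length, seed=0):
    """Recombine learned word-pairs into a new sequence."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = follows.get(out[-1])
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

sample = generate("the", 6)
# Every adjacent word pair in `sample` was seen during training,
# but the sentence as a whole need not match any training sentence.
```

The point of the analogy: "only outputs things it's seen before" is true at the level of local patterns, and false at the level of whole outputs.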
KSRandom195 t1_j3iw5xq wrote
That’s not generating new knowledge.
You’re not going to use this to generate new solutions in software that don’t already exist.
blueSGL t1_j3iwwp2 wrote
There are lots of things I've coded that have not existed before and that merely recombine structures that already exist to tackle new problems. That's why programming languages exist.
That is a 'new solution' to me. What do you mean when you say it?
KSRandom195 t1_j3ix850 wrote
New to you, not new to the industry.
blueSGL t1_j3izpcc wrote
Again, what do you mean by that? People code new software every day.
You can ask for poems that don't exist, essays that don't exist.
All these things have had their structure extracted, understood, and then followed to create new items.
Asking for code is the same.
>Will ChatGPT be able to write better code than any human within the next year?
A good coder needs to eat and sleep, takes time to understand new technology, has a limited scope in the programming languages they know, has good days and bad days, has 'blocks', and is a single unit only able to process problems serially at human speed.