Submitted by enryu42 t3_122ppu0 in MachineLearning
pengo t1_jdt6iv2 wrote
Reply to comment by cegras in [D] GPT4 and coding problems by enryu42
> Then what you have is something that can separate content into logically similar, but orthogonal realizations.
Like a word vector? The thing every language model is based on?
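(For concreteness, a minimal sketch of the "similar but orthogonal" idea with word vectors; the vectors below are toy values invented for illustration, not real embeddings:)

```python
import numpy as np

# Toy 4-dimensional "word vectors" (invented for illustration;
# real embeddings have hundreds of dimensions).
vec = {
    "list":  np.array([0.9, 0.1, 0.0, 0.2]),
    "array": np.array([0.8, 0.2, 0.1, 0.1]),
    "cat":   np.array([0.0, 0.9, 0.7, 0.0]),
}

def cosine(a, b):
    """Cosine similarity: ~1.0 = same direction, ~0.0 = orthogonal."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vec["list"], vec["array"]))  # high: related concepts
print(cosine(vec["list"], vec["cat"]))    # low: unrelated concepts
```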
cegras t1_jdta9mj wrote
More like, the ability to know that "reversing a linked list" and "linked list cycle and traversal problems" draw on the same concepts but are different problems, and to separate those cleanly into train and test sets. Clearly they haven't solved that, because ChatGPT's training data is contaminated with benchmark problems, and their (opaquely disclosed) methods for addressing that contamination don't seem adequate at all.
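One hedged sketch of what such a contamination check could look like, using an off-the-shelf sentence-embedding model. The model name, threshold, and problem texts here are assumptions for illustration, not how OpenAI (or any benchmark) actually deduplicates:

```python
# pip install sentence-transformers
from sentence_transformers import SentenceTransformer, util

# Hypothetical problem statements; in practice these would be the
# full training-corpus and benchmark problem descriptions.
train_problems = [
    "Reverse a singly linked list in place.",
    "Sort an array using merge sort.",
]
test_problems = [
    "Detect whether a linked list contains a cycle.",
    "Find the longest palindromic substring.",
]

# all-MiniLM-L6-v2 is a small general-purpose embedding model;
# this choice is an assumption, not what any lab actually uses.
model = SentenceTransformer("all-MiniLM-L6-v2")
train_emb = model.encode(train_problems, convert_to_tensor=True)
test_emb = model.encode(test_problems, convert_to_tensor=True)

# Cosine similarity between every test/train pair.
sims = util.cos_sim(test_emb, train_emb)

THRESHOLD = 0.6  # arbitrary cutoff for "same underlying concept"
for i, test_p in enumerate(test_problems):
    for j, train_p in enumerate(train_problems):
        score = float(sims[i][j])
        if score > THRESHOLD:
            print(f"possible overlap ({score:.2f}): "
                  f"{test_p!r} ~ {train_p!r}")
```

The point of the embedding approach is exactly the case above: the two linked-list problems share no exact string overlap, so naive substring or n-gram decontamination would miss them, while a semantic-similarity check could flag them for review.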