Submitted by jaqws t3_10dljs6 in MachineLearning
yahma t1_j4owot0 wrote
Reply to comment by MegavirusOfDoom in [D] Fine-tuning open source models on specific tasks to compete with ChatGPT? by jaqws
This may come down to the size of the datasets, but it's hard to say how many parameters would be needed for an LLM that's just really good at explaining code.
MegavirusOfDoom t1_j4pfdi1 wrote
Then we'd have to crawl all of Stack Exchange, all of Wikipedia, and a terabyte of programming books... This kind of "generalist NLP" is aimed at article writing and poetry.

I'm a big fan of teaching ChatGPT how to interpret graphs and their origin lines, recording them in a vector engine that is coupled with the NLP. For a coding engine, I believe the NLP should be paired with a compiler, just as a maths-specialized NLP should also have a MATLAB-type engine.