Submitted by Vegetable-Skill-9700 t3_121a8p4 in MachineLearning
EmmyNoetherRing t1_jdma3em wrote
Reply to comment by Crystal-Ammunition in [D] Do we really need 100B+ parameters in a large language model? by Vegetable-Skill-9700
Introspection? Cog-sci/classical AI folks like to use the term, not always in the best-justified fashion, I think. But when you're hallucinating your own new training data, it seems relevant.