JavaMochaNeuroCam t1_j64e3yg wrote
Reply to comment by green_meklar in What do you guys think of this concept- Integrated AI: High Level Brain? by Akimbo333
Alan was really psyched about DeepMind's Gato (600+ tasks/domains).
I think it's relatively straightforward to bind experts to a general cognitive model.
Basically, the MoE (Mixture of Experts) approach would train the domain-specific model and the cortex (language) model simultaneously. That is, a pre-trained image-recognition model can describe an image (e.g., a cat) in text to an LLM, but it can also bind that description to a vector representing the neural state that captures the representation.
So, you're just binding the language to the domain-specific representations.
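A minimal sketch of what that binding could look like, using a CLIP-style contrastive alignment in PyTorch. Everything here is a hypothetical stand-in (the encoders, dimensions, and the `binding_loss` helper), not the actual Gato or MoE training setup; it just shows the idea of pulling the expert's neural-state vector and the language model's caption vector into a shared space:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins for a frozen, pre-trained image expert and a
# language-side text encoder. In practice these would be real pre-trained
# networks (e.g. a ViT and an LLM); here they are random modules that only
# demonstrate the shapes involved.
image_encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 512))
text_encoder = nn.Sequential(nn.Embedding(30000, 256), nn.Flatten(),
                             nn.Linear(16 * 256, 512))

# The "binding" layers: project the expert's neural state and the language
# representation of its caption into one shared vector space.
bind_image = nn.Linear(512, 256)
bind_text = nn.Linear(512, 256)

def binding_loss(images, caption_tokens):
    """Contrastive alignment: pull each image vector toward the vector of its
    own caption (e.g. "a cat") and away from the other captions in the batch."""
    with torch.no_grad():                      # the domain expert stays frozen
        img_state = image_encoder(images)      # (B, 512) expert's neural state
    txt_state = text_encoder(caption_tokens)   # (B, 512) language-side state

    img_vec = F.normalize(bind_image(img_state), dim=-1)   # (B, 256)
    txt_vec = F.normalize(bind_text(txt_state), dim=-1)    # (B, 256)

    logits = img_vec @ txt_vec.t() / 0.07      # image-to-caption similarities
    targets = torch.arange(len(images))        # matching caption is the target
    return F.cross_entropy(logits, targets)

# Toy batch: 4 images and 4 captions of 16 token ids each.
loss = binding_loss(torch.randn(4, 3, 224, 224),
                    torch.randint(0, 30000, (4, 16)))
loss.backward()
```

After training, the projected image vector can be fed to the language model in place of (or alongside) text, so the LLM's description of "a cat" is anchored to the expert's internal representation rather than to text alone.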
In humans, the hippocampus, thalamus, and claustrum are somehow involved in that kind of binding, if I'm not mistaken.