theoryanddata t1_j69dx28 wrote
Reply to [D] Could forward-forward learning enable training large models with distributed computing? by currentscurrents
I remember reading about this type of concept, and if I recall correctly there does seem to be quite a bit of local learning in biological neural networks. But global convergence of the model seems like a challenge with this type of scheme. Maybe there's some way to incorporate a periodic global backprop pass to address that? Has anyone tried it? Or maybe you don't even need it, and the problem will disappear with enough scale.
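Just to make the "mostly local updates plus an occasional global backprop pass" idea concrete, here's a toy numpy sketch. Everything in it is my own illustration, not something from the FF paper or this thread: the `Layer`/`global_backprop`/`hybrid_step` names, the sum-of-squares "goodness" with a logistic push around a threshold, and the every-K-steps schedule are all assumptions, and the global pass ignores the input-normalization Jacobian for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)


def relu(z):
    return np.maximum(z, 0.0)


def norm(x):
    # FF-style: normalize a layer's input so only its direction carries
    # information forward (length was "used up" by the previous layer).
    return x / (np.linalg.norm(x) + 1e-8)


class Layer:
    def __init__(self, n_in, n_out, lr=0.03):
        self.W = rng.normal(0.0, 0.1, (n_out, n_in))
        self.lr = lr

    def forward(self, x):
        self.x = norm(x)          # cached for the update rules below
        self.z = self.W @ self.x
        self.h = relu(self.z)
        return self.h

    def goodness(self):
        return float(np.sum(self.h ** 2))

    def local_update(self, positive, theta=2.0):
        # Local FF step: logistic loss on goodness g, pushing g above
        # theta for positive samples and below it for negative ones.
        g = self.goodness()
        s = 1.0 if positive else -1.0
        p = 1.0 / (1.0 + np.exp(s * (g - theta)))     # |dL/dg|
        dg_dz = 2.0 * self.h * (self.z > 0)           # exact dg/dz
        self.W -= self.lr * (-s * p) * np.outer(dg_dz, self.x)


def global_backprop(layers, V, x, y, lr=0.03):
    # Ordinary end-to-end backprop on a squared error through a linear
    # readout V. Simplification: treats each layer's normalized input
    # as a constant (the norm() Jacobian is dropped).
    h = x
    for layer in layers:
        h = layer.forward(h)
    err = V @ h - y
    delta = V.T @ err
    V -= lr * np.outer(err, h)
    for layer in reversed(layers):
        delta = delta * (layer.z > 0)
        grad_W = np.outer(delta, layer.x)
        delta = layer.W.T @ delta     # propagate before updating W
        layer.W -= lr * grad_W


def hybrid_step(layers, V, x_pos, y, x_neg, step, K=10):
    # The scheme floated above: cheap local FF updates by default,
    # one synchronizing global backprop pass every K-th step.
    if step % K == 0:
        global_backprop(layers, V, x_pos, y)
    else:
        for sample, positive in ((x_pos, True), (x_neg, False)):
            h = sample
            for layer in layers:
                h = layer.forward(h)
                layer.local_update(positive)
```

In a distributed setting the appeal would be that the local steps need no cross-layer gradient traffic, so only the periodic global pass has to synchronize — whether that's often enough to fix convergence is exactly the open question.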