CireNeikual t1_j3cmwtt wrote

My own work focuses on an alternative to deep learning called Sparse Predictive Hierarchies (SPH). It is implemented in a library called AOgmaNeo (Python bindings also exist). It does not use backpropagation and runs fully online/incrementally/continually (non-i.i.d.). Its main advantages are the online learning and its speed: recently I was able to play Atari Pong (with learning enabled!) on a Teensy 4.1 microcontroller and still get 60 Hz.
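To make the "fully online, no backprop" claim concrete, here is a minimal sketch of that style of learning loop. This is NOT the SPH/AOgmaNeo algorithm, just an illustration of the paradigm: a single layer predicts the next input from the current one and corrects itself with a purely local delta rule, one sample at a time, with no batching and no backpropagation through a deep stack. All names and sizes are my own.

```python
import numpy as np

# Hypothetical sketch (not the actual SPH algorithm): a one-layer
# next-step predictor trained fully online with a local delta rule.

rng = np.random.default_rng(0)
n_in, lr = 8, 0.05
W = rng.normal(scale=0.1, size=(n_in, n_in))  # prediction weights

def step(x_prev, x_next):
    """One online update: predict x_next from x_prev, correct locally."""
    global W
    pred = W @ x_prev
    err = x_next - pred                  # local prediction error
    W += lr * np.outer(err, x_prev)      # delta rule, no backprop
    return float(np.mean(err ** 2))

# Stream a short repeating sequence; error falls sample by sample,
# without ever storing a dataset or replay buffer.
pattern = rng.normal(size=(4, n_in))
errs = []
for t in range(400):
    a, b = pattern[t % 4], pattern[(t + 1) % 4]
    errs.append(step(a, b))
```

The point of the sketch is the shape of the loop (observe, predict, locally correct, discard the sample), which is what makes this kind of learner cheap enough for a microcontroller.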

If you would like to know more about it, here is a link to a presentation I gave a while back (Google Drive).

Other than my own work, I find the Tsetlin Machine interesting as well.

30

CireNeikual t1_iw5fuqx wrote

Predictive coding is a good place to start, but I think it's also important to embrace sparsity to permit computationally efficient, fully online/incremental learning. As it stands, predictive coding is mostly used as a drop-in replacement for backpropagation, without providing many additional advantages. Predictive coding by itself doesn't permit online learning.
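The efficiency argument above can be sketched in a few lines. This is a hedged illustration under my own assumptions (names, sizes, and the top-k rule are mine, not from any particular library): a predictive-coding-style reconstruction error combined with hard top-k sparsity, so each online update touches only the k active rows of the weight matrix instead of all of them.

```python
import numpy as np

# Sketch: predictive-coding error + top-k sparsity. Only k of n_hid
# units are active per sample, so the online update is O(k * n_in),
# not O(n_hid * n_in).

rng = np.random.default_rng(1)
n_in, n_hid, k, lr = 16, 64, 4, 0.1
W = rng.normal(scale=0.1, size=(n_hid, n_in))

def encode(x):
    """Top-k winner-take-all code: indices of the k most active units."""
    return np.argsort(W @ x)[-k:]

def learn(x):
    """One online step: reconstruct x from the sparse code, then apply a
    local error-driven update to the k active rows only."""
    idx = encode(x)
    recon = W[idx].sum(axis=0) / k   # prediction of the input
    err = x - recon                  # predictive-coding error signal
    W[idx] += lr * err / k           # local update, k rows touched
    return float(np.mean(err ** 2))

data = rng.normal(size=(32, n_in))
errs = [float(np.mean([learn(x) for x in data])) for _ in range(50)]
```

Reconstruction error falls while each step stays cheap; that cheapness per sample is what makes the fully online setting practical.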

5