CireNeikual t1_j3cmwtt wrote
My own work focuses on an alternative to deep learning called Sparse Predictive Hierarchies (SPH). It is implemented in a library called AOgmaNeo (Python bindings also exist). It does not use backpropagation and runs fully online/incrementally/continually (non-i.i.d.). Its main advantages are online learning and speed: it runs very fast. Recently, I was able to play Atari Pong (with learning enabled!) on a Teensy 4.1 microcontroller, and still get 60 Hz.
If you would like to know more about it, here is a link to a presentation I gave a while back (Google Drive).
Other than my own work, I find the Tsetlin Machine interesting as well.
CireNeikual t1_iw5fuqx wrote
Reply to comment by genesis05 in [Project] Erlang based framework to replace backprop using predictive coding by abhitopia
Predictive coding is a good place to start, but I think it's also important to embrace sparsity to permit computationally efficient, fully online/incremental learning. As it stands, predictive coding is mostly just used as a drop-in replacement for backpropagation, without really providing many additional advantages. Predictive coding by itself doesn't enable online learning.
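To make the "drop-in replacement" point concrete, here is a toy sketch of plain linear predictive coding (not SPH/AOgmaNeo, and names are my own): a weight matrix predicts the input from a latent state, the latent settles by descending the prediction error, and the weights get a local error-times-activity update. Everything is per-sample, so it can run in a streaming loop, but nothing here gives you sparsity or forgetting-resistance for free.

```python
import random

# Toy single-layer predictive coding (illustrative only): W predicts the
# input x from a latent r; updates are local and applied one sample at a
# time, with no backpropagation through a computation graph.

def predict(W, r):
    return [sum(W[i][j] * r[j] for j in range(len(r))) for i in range(len(W))]

def infer_latent(W, x, r, steps=20, lr=0.1):
    # Settle the latent r by gradient descent on the prediction error.
    for _ in range(steps):
        e = [x[i] - p for i, p in enumerate(predict(W, r))]
        for j in range(len(r)):
            r[j] += lr * sum(e[i] * W[i][j] for i in range(len(W)))
    return r

def learn(W, x, r, lr=0.05):
    # Local Hebbian-style update: prediction error times latent activity.
    e = [x[i] - p for i, p in enumerate(predict(W, r))]
    for i in range(len(W)):
        for j in range(len(r)):
            W[i][j] += lr * e[i] * r[j]
    return sum(ei * ei for ei in e)  # squared prediction error

random.seed(0)
W = [[random.uniform(-0.1, 0.1) for _ in range(3)] for _ in range(2)]
x = [0.5, -0.3]  # a fixed input pattern to learn to reconstruct
errors = []
for _ in range(50):  # online loop: infer the latent, then update weights
    r = infer_latent(W, x, [0.0, 0.0, 0.0])
    errors.append(learn(W, x, r))
# The prediction error shrinks as W adapts to the stream of samples.
```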
CireNeikual t1_ja93iae wrote
Reply to comment by Jonas_SV in [P] Basic autodiff library for scalar values in C by JanBitesTheDust
I would actually recommend Cython over ctypes; it's nicer, especially when it comes to handling NumPy arrays.
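For a sense of why Cython is pleasant with NumPy: typed memoryviews let you iterate over an array's buffer in a tight C loop, instead of marshalling pointers by hand as with ctypes. A minimal sketch (module and function names are hypothetical; this is a `.pyx` file that would need to be compiled with `cythonize`):

```cython
# sum_sq.pyx -- hypothetical example module, built via cythonize

def sum_sq(double[:] a):
    # Typed memoryview: Cython reads the NumPy buffer directly, so the
    # loop compiles to plain C with no per-element Python overhead.
    cdef Py_ssize_t i
    cdef double total = 0.0
    for i in range(a.shape[0]):
        total += a[i] * a[i]
    return total
```

From Python you would then call it as an ordinary function, e.g. `sum_sq(np.asarray(values, dtype=np.float64))`.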