Mental-Swordfish7129 t1_j2v20d2 wrote
Reply to comment by currentscurrents in [R] Do we really need 300 floats to represent the meaning of a word? Representing words with words - a logical approach to word embedding using a self-supervised Tsetlin Machine Autoencoder. by olegranmo
That's amazing. We probably haven't fully realized the analytical power available to us in the Fourier transform, the wavelet transform, and similar strategies.
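As a minimal illustration of the kind of analysis being alluded to (not anything from the linked paper), a Fourier transform pulls the constituent frequencies straight out of a composite signal; the signal and frequencies below are made up for the demo:

```python
import numpy as np

# A composite signal: a 5 Hz component plus a weaker 12 Hz component.
fs = 100                      # sampling rate in Hz
t = np.arange(0, 1, 1 / fs)   # one second of samples
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# The real-input FFT exposes the frequency content directly.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The two largest spectral peaks land exactly at the component frequencies.
top_two = sorted(freqs[np.argsort(spectrum)[-2:]])
print(top_two)  # [5.0, 12.0]
```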
[deleted] t1_j2zn5o5 wrote
I think that's primarily how neural networks do their magic, really. It's frequencies and probabilities all the way down.
Mental-Swordfish7129 t1_j310xxm wrote
Yes! I'm currently experimenting with modifying a Kuramoto model to function as a neural network, and it seems very promising.
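For readers unfamiliar with it, the classic Kuramoto model couples N phase oscillators via dθ_i/dt = ω_i + (K/N) Σ_j sin(θ_j − θ_i), and above a critical coupling strength they spontaneously synchronize. This is a generic sketch of that baseline model (not the commenter's network variant), with all parameter values chosen just for the demo:

```python
import numpy as np

def kuramoto_step(theta, omega, K, dt):
    """One Euler step of dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
    return theta + dt * (omega + K * coupling)

def order_parameter(theta):
    """r in [0, 1]: 0 means incoherent phases, 1 means full synchrony."""
    return np.abs(np.exp(1j * theta).mean())

rng = np.random.default_rng(0)
N = 50
theta = rng.uniform(0, 2 * np.pi, N)  # random initial phases
omega = rng.normal(0.0, 1.0, N)       # natural frequencies

r_start = order_parameter(theta)
for _ in range(2000):
    theta = kuramoto_step(theta, omega, K=5.0, dt=0.05)
r_end = order_parameter(theta)

# With K well above the critical coupling, r climbs from near-incoherence
# toward synchrony.
print(f"r: {r_start:.2f} -> {r_end:.2f}")
```

The "neural network" angle would presumably replace the uniform all-to-all coupling K/N with learned pairwise weights, but that design is the commenter's own and isn't shown here.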
[deleted] t1_j3152ys wrote
Wellllll that seems cool as hell... Seems like steam punk neuroscience hahaha. I love it!