
Carrasco_Santo t1_jd4wigs wrote

I like to see all these advances optimizing machine learning more and more. In 10 years (being pessimistic) it will be very interesting, and I sincerely hope that neuromorphic processors leave the laboratory and become real, this would advance the area even further.


brownmamba94 t1_jd6zyd5 wrote

I totally agree, and really wonder how the landscape will look in 10 years when it comes to ML model architectures, training strategies, optimization techniques, etc. It'll be very interesting. Although plasticity-based learning, spiking neural networks, and other neuromorphic algorithms that use local learning rules don't get the same kind of attention as gradient-based learning, I do believe mimicking the brain's neural activity by emulating spiking neural networks could one day be a good solution for inference (in terms of cost and power efficiency). That said, implementing spike-based learning and training has so far proven to be a challenge. But hey, one thing these approaches have in common is that sparsity is a key enabler for this type of hardware.
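For anyone wondering what "local learning rules" actually look like, here's a rough sketch of a leaky integrate-and-fire layer with a Hebbian-style update. All the constants and shapes are made up for illustration; this isn't modeled on any particular neuromorphic chip or paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy leaky integrate-and-fire (LIF) layer with a local Hebbian-style update.
n_in, n_out = 64, 8
w = rng.normal(0, 0.1, size=(n_out, n_in))  # synaptic weights
v = np.zeros(n_out)                         # membrane potentials
tau, v_thresh, lr = 0.9, 1.0, 0.01          # illustrative constants

for t in range(100):
    x = (rng.random(n_in) < 0.05).astype(float)  # sparse input spikes
    v = tau * v + w @ x                          # leaky integration
    spikes = (v >= v_thresh).astype(float)       # fire on threshold crossing
    v[spikes == 1] = 0.0                         # reset fired neurons
    # Local rule: each weight updates using only its own pre- and
    # post-synaptic activity, with no backpropagated error signal.
    w += lr * np.outer(spikes, x)
```

The point is that every update only needs information already sitting at the synapse, which is what makes these rules attractive for low-power, event-driven hardware.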


Carrasco_Santo t1_jd8e34b wrote

Imagine the situation where, after enough studies, some international team manages to optimize the functioning of artificial neurons to the point where they are more efficient than biological neurons. We would automatically be outclassed.

And this is possible: scientists around the world have studied ways to optimize natural processes for some purpose, for example reducing the number of steps photosynthesis needs to produce sugar, making the process faster and more economical. The same may happen with the functioning of neurons and their capabilities.


brownmamba94 t1_jdawqp9 wrote

That's a pretty interesting thought... it reminds me of that research from MIT that came out last summer, "How computationally complex is a single neuron?" Work like this can potentially help advance the field of analog deep learning. I think sparsity will play a role here at both the connection level and the neuron level, potentially further reducing energy consumption and allowing for better resource utilization.
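A back-of-the-envelope way to see the sparsity argument, with purely illustrative densities (not measured from any real model or chip):

```python
import numpy as np

rng = np.random.default_rng(0)

# Multiply-accumulates (MACs) only matter where both the weight and the
# input activation are nonzero. Densities below are arbitrary assumptions.
n_in, n_out = 1024, 1024
weight_density = 0.10       # 90% of connections pruned
activation_density = 0.05   # 95% of neurons silent at this timestep

w_mask = rng.random((n_out, n_in)) < weight_density
x_mask = rng.random(n_in) < activation_density

dense_macs = n_out * n_in
effective_macs = int(w_mask[:, x_mask].sum())  # MACs that actually fire

print(f"dense:  {dense_macs:,} MACs")
print(f"sparse: {effective_macs:,} MACs "
      f"(~{effective_macs / dense_macs:.1%} of dense)")
```

Of course, you only get those savings in practice if the hardware can actually skip the zeros rather than computing them anyway, which is exactly what event-driven neuromorphic designs try to exploit.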
