Submitted by CS-fan-101 t3_11xskuk in MachineLearning
brownmamba94 t1_jd6zyd5 wrote
Reply to comment by Carrasco_Santo in [R] SPDF - Sparse Pre-training and Dense Fine-tuning for Large Language Models by CS-fan-101
I totally agree, and I really wonder how the landscape will look in 10 years when it comes to ML model architectures, training strategies, optimization techniques, etc. It'll be very interesting. Although plasticity-based learning, spiking neural networks, and other neuromorphic algorithms that use local learning rules don't get the same kind of attention as gradient-based learning, I do believe mimicking the neural activity of the brain by emulating spiking neural networks could one day be a good solution for inference (in terms of cost and power efficiency). Currently, though, implementing spike-based learning and training has proven to be a challenge. But hey, one thing these approaches have in common is that sparsity is a key enabler for this type of hardware.
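To make the spiking idea concrete, here's a minimal sketch of a leaky integrate-and-fire neuron, the simplest spiking model: the membrane potential integrates input, leaks toward rest, and emits a discrete spike on crossing a threshold. All constants and names here are illustrative, not taken from any particular neuromorphic system:

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron; returns a binary spike train."""
    v = v_rest
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        # Leaky integration: decay toward rest plus injected current.
        v += (dt / tau) * (v_rest - v) + i_t
        if v >= v_threshold:   # threshold crossing: spike, then reset
            spikes[t] = 1.0
            v = v_reset
    return spikes

# The output is a sparse event stream: neuromorphic hardware exploits
# exactly this, doing work only on the (rare) nonzero spikes.
rng = np.random.default_rng(0)
spike_train = simulate_lif(rng.uniform(0.0, 0.15, size=200))
print(f"firing rate: {spike_train.mean():.2%}")
```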
Carrasco_Santo t1_jd8e34b wrote
Imagine the situation where, after so many studies, some international team manages to optimize the functioning of artificial neurons to the point where they are more efficient than biological neurons. We would automatically be outclassed.
And this is possible: scientists around the world have studied ways to optimize natural processes for some purpose. For example, there is work on reducing the number of steps photosynthesis needs to produce sugar, making the process faster and more economical. The same may happen with the functioning of neurons and their capabilities.
brownmamba94 t1_jdawqp9 wrote
That's a pretty interesting thought... it reminds me of this research from MIT that came out last summer, asking how computationally complex a single neuron actually is. Work like this can potentially help advance the field of analog deep learning. I think sparsity will play a role here at both the connection level and the neuron level, potentially further reducing energy consumption and allowing for better resource utilization.
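For anyone unfamiliar with the distinction, here's a toy sketch of the two granularities of sparsity I mean. The shapes and sparsity levels are arbitrary examples, and the magnitude/norm pruning criteria are just common illustrative choices, not anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((8, 8))

# Connection-level (unstructured) sparsity: zero out individual
# weights with the smallest magnitudes; here, 75% of them.
k = int(0.75 * weights.size)
threshold = np.sort(np.abs(weights), axis=None)[k]
unstructured = np.where(np.abs(weights) >= threshold, weights, 0.0)

# Neuron-level (structured) sparsity: zero out entire rows, i.e.
# whole neurons, ranked by their row L2 norm; here, the bottom half.
row_norms = np.linalg.norm(weights, axis=1)
keep = row_norms >= np.sort(row_norms)[len(row_norms) // 2]
structured = weights * keep[:, None]

print("unstructured zeros:", (unstructured == 0).mean())
print("structured zeros:  ", (structured == 0).mean())
```

Unstructured sparsity preserves more accuracy per zeroed weight but is harder for dense hardware to exploit, while neuron-level sparsity maps directly onto smaller matrix shapes, which is part of why hardware support for fine-grained sparsity matters.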