Submitted by EntireContext t3_zasjrg in singularity
Imaginary_Ad307 t1_iynk0ou wrote
Also, the differential equation modeling interaction between neurons was solved last November, clearing the path for very complex neural networks without bottlenecks from numeric integration. So I'm with you: AGI is going to be a reality very soon.
dasnihil t1_iynmdk8 wrote
We also have people like Joscha Bach and Yoshua Bengio working on alternative networks like generative flow networks, which learn by sampling whatever data is available, almost like how humans learn, unlike deep learning, which needs a huge training dataset.
EntireContext OP t1_iynk9h4 wrote
I saw that headline but didn't go deep into it. Is it real progress, or hype? How big are the efficiency gains? How long before they can implement it?
And aren't neural nets super complex already with all those billions of parameters?
Imaginary_Ad307 t1_iynkwt3 wrote
To my very limited understanding, you need huge servers to run complex neural networks because the interactions have to be solved using numeric integration. With a symbolic solution this restriction disappears, opening the path to running these networks on less powerful servers, maybe even personal computers and phones.
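To illustrate the distinction being drawn here (a toy sketch, not the actual result from the paper): numerically integrating even a simple ODE like dx/dt = -k·x takes many small steps, while its closed-form solution x(t) = x0·e^(-kt) is a single function evaluation. The function names below are mine, purely for illustration.

```python
import math

def euler_integrate(x0, k, t_end, steps):
    """Approximate dx/dt = -k*x numerically with fixed Euler steps."""
    x, dt = x0, t_end / steps
    for _ in range(steps):
        x += dt * (-k * x)  # one small step per iteration
    return x

def closed_form(x0, k, t_end):
    """Evaluate the symbolic solution x(t) = x0 * exp(-k*t) in one shot."""
    return x0 * math.exp(-k * t_end)

# The numeric route needs thousands of steps to approach the exact value;
# the closed-form route costs a single exp() call.
approx = euler_integrate(1.0, 0.5, 4.0, 10_000)
exact = closed_form(1.0, 0.5, 4.0)
```

The claimed speedup for neural networks follows the same logic: if the neuron interactions have a symbolic solution, the expensive step-by-step integration at inference time goes away.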
manOnPavementWaving t1_iyo3sqa wrote
This doesn't hold for the networks currently in use, only if we want to more closely simulate human brains. There is no real indication yet that we can train these better or that they work better.
AvgAIbot t1_iyns2xs wrote
What about utilizing quantum computers? Or is that not applicable
Makingggserver t1_iyskaef wrote
what does that mean