Submitted by Dat_koneh-98 t3_zmnmji in singularity

Even though I am well versed in software engineering and computer science, I lack in-depth knowledge about the challenges on the road to AGI. So, go easy on me if it's an unreasonable question.

What I am curious about is this: the mammalian brain can only transmit signals at a maximum of about 120 m/s, and its neurons can only switch at around 1,000 Hz, while a modern computer can do trillions of calculations per second (and supposedly even more if you are talking about a quantum computer). So, is it possible for one clever programmer to simulate the interaction of billions of neurons just by "reusing" cores in the GPU and context switching between different threads, so that each thread acts like multiple neurons at different instants? One core, but switching so fast that it acts like many different neurons within the same minute. I understand that people a lot smarter than I am wouldn't be asking for more and more processing power if it were that simple, but I am genuinely curious whether this is a viable path to an AGI that can run on a simple computer with a GPU that can be found in any household.
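To make concrete what I mean, here is a minimal sketch assuming a toy leaky integrate-and-fire neuron model (all names and numbers here are just illustrative, not any real framework's API). The grid-stride loop is exactly the kind of core reuse I'm describing: a few thousand GPU threads stand in for a million neurons by each updating many of them every timestep:

```
// Toy sketch of time-multiplexing GPU threads across many neurons.
// Leaky integrate-and-fire model; all parameters are illustrative.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void lif_step(float* v, const float* input, int n,
                         float leak, float threshold) {
    // Grid-stride loop: each thread is "reused" for many neurons
    // per simulated timestep.
    for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n;
         i += gridDim.x * blockDim.x) {
        float vi = v[i] * leak + input[i];  // decay, then integrate input
        if (vi > threshold) vi = 0.0f;      // "spike" and reset
        v[i] = vi;
    }
}

int main() {
    const int n = 1 << 20;                  // ~1M neurons, far more than threads
    float *v, *input;
    cudaMallocManaged(&v, n * sizeof(float));
    cudaMallocManaged(&input, n * sizeof(float));
    for (int i = 0; i < n; ++i) { v[i] = 0.0f; input[i] = 0.02f; }

    // 64 blocks x 256 threads = 16,384 threads simulating 1M neurons:
    // each thread plays the role of ~64 neurons every timestep.
    for (int step = 0; step < 1000; ++step)
        lif_step<<<64, 256>>>(v, input, n, 0.95f, 1.0f);
    cudaDeviceSynchronize();
    printf("v[0] after 1000 steps: %f\n", v[0]);
    cudaFree(v);
    cudaFree(input);
    return 0;
}
```

Of course, this only shows that the raw arithmetic is cheap; it says nothing about the connectivity or learning rules, which I gather is the hard part.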

Thank you.

10

Comments

Accomplished_Diver86 t1_j0c6z16 wrote

Hm good question. I don’t know for sure.

One problem that could arise from this, however, is that a neuron doesn't only act as a transmitter; its physical position is data in itself.

So if you reuse the same simulated neuron for multiple roles at the same time, you might be losing information compared to a human brain, because it's the same unit rather than one situated at its own distinct location.

That's all I've got, though, and I am not an expert. Just my thoughts; as far as I know, we don't understand the brain well enough to even answer your question reliably.

2

94746382926 t1_j0cfvwd wrote

It's an open question. There are many opinions on it, but no one knows for sure.

2

Desperate_Food7354 t1_j0cokz7 wrote

As far as I know we already have the processing power to simulate the human brain, just not the software. A Google search says the human brain runs at around 1 exaFLOP, and the new Frontier supercomputer runs at 1.1 exaFLOPS. Apparently there's also one in China that's about 5x that.
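For what it's worth, the rough math behind that ~1 exaFLOP figure looks something like this (host-only code; the biological numbers are ballpark estimates people commonly cite, not exact measurements):

```
// Back-of-envelope behind the ~1 exaFLOP brain estimate.
// Biological figures are rough, commonly cited approximations.
#include <cstdio>

int main() {
    double neurons  = 8.6e10;  // ~86 billion neurons
    double synapses = 1.0e4;   // ~10,000 synapses per neuron (rough)
    double rate_hz  = 1.0e3;   // ~1 kHz upper bound on switching rate
    double ops      = neurons * synapses * rate_hz;
    printf("~%.2e synaptic ops/s = ~%.2f exaOPS\n", ops, ops / 1e18);
    // Prints ~8.60e17 ops/s, i.e. roughly 1 exaOP/s.
    return 0;
}
```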

2

AsheyDS t1_j0ds0t4 wrote

Software is the bigger issue, but we don't want to simulate a human brain to create AGI.

1

mocha_sweetheart t1_j0fmcwd wrote

Yeah, simulating a human brain isn't really the only way to create AGI, considering all the inefficiencies of the human brain. For example, I think artificial neural networks can have thousands of connections for a single neuron, unlike the brain, which only has a few for each one. Simulating a human brain is just the route we know from the one example of sapience we have on Earth, but it's not at all the best way.
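For instance, in a fully connected layer every output unit reads every input, so each artificial "neuron" can easily have thousands of incoming connections. A toy sketch (names and sizes made up for illustration):

```
// Toy fully connected (dense) layer: with n_in = 4096, every output
// unit has 4096 incoming connections. Purely illustrative.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void dense_forward(const float* w, const float* x, float* y,
                              int n_in, int n_out) {
    int j = blockIdx.x * blockDim.x + threadIdx.x;  // one output unit
    if (j >= n_out) return;
    float acc = 0.0f;
    for (int i = 0; i < n_in; ++i)      // one loop iteration per connection
        acc += w[j * n_in + i] * x[i];
    y[j] = acc > 0.0f ? acc : 0.0f;     // ReLU activation
}

int main() {
    const int n_in = 4096, n_out = 1024;
    float *w, *x, *y;
    cudaMallocManaged(&w, n_out * n_in * sizeof(float));
    cudaMallocManaged(&x, n_in * sizeof(float));
    cudaMallocManaged(&y, n_out * sizeof(float));
    for (int i = 0; i < n_out * n_in; ++i) w[i] = 1.0f / n_in;
    for (int i = 0; i < n_in; ++i) x[i] = 1.0f;

    dense_forward<<<(n_out + 255) / 256, 256>>>(w, x, y, n_in, n_out);
    cudaDeviceSynchronize();
    printf("y[0] = %f (4096 connections summed)\n", y[0]);  // expect 1.0
    cudaFree(w); cudaFree(x); cudaFree(y);
    return 0;
}
```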

May I pm you btw?

1

AsheyDS t1_j0hawx7 wrote

It's not only inefficient, it's also unwanted. We have a lot of biases that we don't want to include, and there are biological processes that simply don't need to be included. A digital intelligence can operate very differently and more efficiently; in a lot of ways the human brain is constrained by its own biology, so we don't need to replicate those constraints either. The only potential use of mapping the brain, as far as AGI goes, is if it can shed light on human behaviors to make interaction easier.

And yeah feel free to pm me if you'd like.

2