
Dr_seven t1_jadny3h wrote

>“Brains also have an amazing capacity to store information, estimated at 2,500TB,” Hartung added. “We’re reaching the physical limits of silicon computers because we cannot pack more transistors into a tiny chip. But the brain is wired completely differently. It has about 100bn neurons linked through over 10^15 connection points. It’s an enormous power difference compared to our current technology.”

This part in particular made me squint a little bit.

For starters, we don't fully grasp how memory works in the brain, but we know it isn't like mechanical or electrical memory, with physical bits that flip. It seems to be tied to the combinations of neurons that fire, of which there are an astronomical number of permutations, and that is what produces the sky-high estimates of how much "data" the brain can hold... but the brain doesn't hold data like that, at least not for most humans.
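Just to illustrate how sensitive these headline numbers are to the assumptions behind them, here's a quick back-of-envelope sketch in Python. The synapse count comes from the quote above; the bits-per-synapse figure and the subset size are assumptions I picked purely for illustration, not established values:

```python
# Rough back-of-envelope sketch using the figure from the quoted article plus
# an ASSUMED bits-per-synapse value. Published estimates vary a lot precisely
# because this assumption is up for grabs.

SYNAPSES = 1e15          # ~10^15 connection points, per the quote
BITS_PER_SYNAPSE = 4.7   # assumption for illustration; per-synapse precision estimates vary widely

# Estimate 1: treat each synapse as a small analog storage element.
total_bits = SYNAPSES * BITS_PER_SYNAPSE
terabytes = total_bits / 8 / 1e12
print(f"Synapse-as-storage estimate: ~{terabytes:,.0f} TB")  # roughly 590 TB with these numbers

# Estimate 2: count distinct firing patterns instead. Even a small subset of
# neurons, each simply on or off, gives a combinatorially enormous state space,
# which is how the really sky-high "capacity" figures get generated.
subset = 1_000
digits = int(subset * 0.30103)   # log10(2^subset)
print(f"Distinct on/off patterns across {subset} neurons: 2^{subset} (~10^{digits})")
```

The point isn't the specific outputs, it's that the answer swings by orders of magnitude depending on what you decide counts as a "unit of storage."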

This complexity makes it impractical to model on anything less than the largest supercomputers, and even then, we aren't actually modeling brain activity in the sense of knowing why Pattern X leads to "recalling what that stroganoff tasted like on April 7, 2004".

This matters because, while we may be able to stimulate neurons in a lab in a way that makes them useful for data storage, it isn't necessarily the same way that human brains store information. Indeed, human memory would be a horrible baseline for a computer, given the brain's tendency to confabulate details at the time of recall that are not consistent with reality. Most people's memories of most things are inaccurate, but close enough to work out alright. That's exactly the sort of thing you don't want from a computer's memory.

This is compelling stuff, but we have a long way to go before we even understand what we are dealing with in practical terms.
