net_junkey t1_ja9ktk7 wrote

Reply to comment by billtowson1982 in So what should we do? by googoobah

AIs understand. Human brains learn concepts by forming a bundle of neurons dedicated to the concept of (let's say) "cat" based on the input of our senses - sight, smell... Modern AIs are designed to replicate the same process 1 to 1 on a software level. If anything, they understand basic concepts better than humans.

The big jump right now is getting AIs to understand the relationships between concepts. Example: "cat" should be linked to the concept of "pet" and definitely not to the concept of "oven".
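A toy sketch of what "linked concepts" means in practice: in many modern systems, concepts are represented as embedding vectors, and related concepts sit closer together. The vectors below are made up purely for illustration (real models learn vectors with hundreds of dimensions); only the cosine-similarity idea is standard.

```python
import math

# Hypothetical 3-dimensional embedding vectors, invented for this example.
embeddings = {
    "cat":  [0.9, 0.8, 0.1],
    "pet":  [0.8, 0.9, 0.2],
    "oven": [0.1, 0.2, 0.9],
}

def cosine(a, b):
    # Cosine similarity: close to 1.0 for related concepts,
    # close to 0.0 for unrelated ones.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine(embeddings["cat"], embeddings["pet"]))   # high: related
print(cosine(embeddings["cat"], embeddings["oven"]))  # low: unrelated
```

With real learned embeddings, "cat" ends up measurably closer to "pet" than to "oven" for the same reason: the two words co-occur in similar contexts during training.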

Problem is, there are still kinks in the relationships-between-concepts part. AI is modeled on the human brain, and the human brain is not a perfect system. In theory, writing a simulation of the human Id, Ego, and Super-Ego and bundling it into a sentient AI package is quite doable. Making it happen while the foundations are still unstable is practically impossible.

billtowson1982 t1_jaa2f0n wrote

You don't know anything about AIs, do you? I mean, you read an article in USA Today and now I'm having to hear you repeat things from it, plus some stuff you imagined to be reasonable extrapolations based on what you read.

net_junkey t1_jabo5vx wrote

The learning part of AI is based on (or at least similar to) how neurons learn. Once an AI has learned/been trained, it stores the learned data and the filters for it on disk.

How does a brain work? Data is written in neuron clusters (scientists have been able to find neuron bundles representing concepts). The filters are the neural connections coming out of those bundles. The brain optimises performance by strengthening commonly used connections and removing old, unused ones.
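The strengthen-and-prune idea can be sketched in a few lines. This is my own Hebbian-style illustration, not how any specific AI or brain actually works: connections that keep getting used are boosted, the rest decay, and anything that decays below a threshold gets pruned.

```python
# Hypothetical connection weights between concept pairs (illustration only).
weights = {("cat", "pet"): 0.5, ("cat", "oven"): 0.5}

def update(weights, coactivated, boost=0.1, decay=0.05, prune_below=0.2):
    # Strengthen connections that fired together, decay the rest,
    # and drop (prune) any connection that has become too weak.
    updated = {}
    for pair, w in weights.items():
        w = w + boost if pair in coactivated else w - decay
        if w >= prune_below:
            updated[pair] = min(w, 1.0)  # cap strength at 1.0
    return updated

# "cat" and "pet" keep co-occurring; "cat" and "oven" never do.
for _ in range(10):
    weights = update(weights, {("cat", "pet")})
print(weights)  # cat-pet strengthened; cat-oven decayed and pruned away
```

The parameter values here are arbitrary; the point is just the mechanism: repeated co-activation strengthens a link, disuse weakens it until it disappears.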

Trained AI + continuous learning algorithm = a basic brain, even if only comparable to an insect's.
