Submitted by intergalacticskyline t3_xyb4h0 in singularity
TopicRepulsive7936 t1_irgsogq wrote
Reply to comment by sumane12 in When do you think we'll have AGI, if at all? by intergalacticskyline
Explain. How many definitions have you come across?
sumane12 t1_irhve4c wrote
Intelligence of the smartest human.
Intelligence of the dumbest human.
Intelligence of the average human.
Human intelligence and sentient.
Human intelligence and not sentient.
Generalise from one task to a second.
Generalise from one task to multiple tasks.
Generalise from one task to every task achievable by a human.
Generalise from one task to every task achievable by every human.
The 'G' in AGI stands for 'general', meaning any AI that can generalise skills from one task to another, e.g. being trained on Go and transferring those skills to chess. That is the simplest definition of AGI.
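For anyone who wants a concrete picture of what that kind of transfer looks like in today's ML, here is a minimal sketch in PyTorch (every task, size, and dataset below is synthetic and made up purely for illustration): a network's feature layers are trained on one task, then frozen and reused, with a fresh output head, on a second task.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Shared "body" that learns reusable features, plus task-specific heads.
    body = nn.Sequential(nn.Linear(16, 32), nn.ReLU(),
                         nn.Linear(32, 32), nn.ReLU())
    head_a = nn.Linear(32, 4)  # task A head (a toy stand-in for "Go")
    loss_fn = nn.CrossEntropyLoss()

    # Train the body and head A together on synthetic task A data.
    x_a, y_a = torch.randn(256, 16), torch.randint(0, 4, (256,))
    opt_a = torch.optim.Adam([*body.parameters(), *head_a.parameters()], lr=1e-3)
    for _ in range(200):
        opt_a.zero_grad()
        loss_fn(head_a(body(x_a)), y_a).backward()
        opt_a.step()

    # Transfer: freeze the body, train only a fresh head on task B.
    for p in body.parameters():
        p.requires_grad = False
    head_b = nn.Linear(32, 2)  # task B head (a toy stand-in for "chess")
    x_b, y_b = torch.randn(256, 16), torch.randint(0, 2, (256,))
    opt_b = torch.optim.Adam(head_b.parameters(), lr=1e-3)
    for _ in range(200):
        opt_b.zero_grad()
        loss_fn(head_b(body(x_b)), y_b).backward()
        opt_b.step()

Whether those frozen features actually help on the second task is exactly the measure of how general the system is; current narrow models mostly need retraining to move between games as different as Go and chess, which is why this bar for AGI hasn't been met yet.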
TopicRepulsive7936 t1_iridvf0 wrote
Define all of those words you used. I can't understand anything you said.
sumane12 t1_irif2r9 wrote
Google works
TopicRepulsive7936 t1_irifok4 wrote
Define Google.
sumane12 t1_iriwdqi wrote
Yes
subdep t1_irhi2u5 wrote
AGI == human-level intelligence
The problem is that humans have a wide spectrum of intelligence. I believe you were making one point, and the other person was trying to make the other.
I think we can all agree that regardless of where AGI falls on that spectrum, once we get something smarter than any single human that has ever existed, we are in ASI territory.