
blueSGL t1_j9ljrw4 wrote

Might not even get AGI before ASI.

You'd need a narrow AI that is better at architecting AIs than humans, and the rest is history.

21

maskedpaki t1_j9lu6nk wrote

Being able to architect AIs seems like a very general task, though.

I'm not confident a narrow AI could do it well enough to make an AGI.

7

blueSGL t1_j9mdxby wrote

Again, I think we are running up against a semantics issue.

What percentage of human activity would you need it to cover before you'd class the thing as 'general'?

Because some people argue that anything below 100% != 'general' and is therefore 'narrow' by elimination.

Personally, I think it's reasonable that if you loaded a system with everything about how ML currently works / all the published papers and tasked it with spitting out a more optimal system, it just might do so (a toy sketch of that loop is below). All without being able to do a lot of the things that would be classed as human-level intelligence. There are whole swaths of data concerning human matters that it would not need to train on, or that the system would in no way need to be even middling-expert at.

6
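A minimal sketch of the kind of "narrow architect" loop described above: a toy random search over a made-up hyperparameter space. This is not any real AutoML system, and every name and number in it is invented purely for illustration.

```python
import random

# Toy search space standing in for "architectural choices" -- purely illustrative.
SEARCH_SPACE = {
    "num_layers":    [2, 4, 8, 16],
    "hidden_width":  [64, 128, 256, 512],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def sample_architecture():
    """Propose a candidate configuration by sampling each choice at random."""
    return {name: random.choice(options) for name, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for training and validating a candidate.

    A real system would train the candidate model and report held-out
    accuracy; this made-up scoring function just keeps the example
    self-contained and runnable.
    """
    return (
        arch["num_layers"] * 0.01
        + arch["hidden_width"] * 0.0005
        - abs(arch["learning_rate"] - 1e-3) * 50
        + random.gauss(0, 0.02)  # noise, since real evaluations are noisy
    )

def architect(budget=50):
    """The 'narrow architect': propose -> evaluate -> keep the best, repeat."""
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        candidate = sample_architecture()
        score = evaluate(candidate)
        if score > best_score:
            best_arch, best_score = candidate, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = architect()
    print(f"best candidate: {arch} (score {score:.3f})")
```

Real neural architecture search systems replace the random sampler with a learned or evolutionary proposer and replace the fake `evaluate()` with actual training runs; the point here is only the shape of the loop, which is narrow in scope even though what it optimizes could be very general.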