HeinrichTheWolf_17 t1_izvfw5c wrote

I’ve been saying it’s going to be a hard takeoff for 8 years now and everyone thought I was nuts. There’s no reason to assume an AGI would take as long to learn things just because the human brain does. Even Kurzweil is wrong here.

The writing is on the wall, guys. We don't have to wait until 2045.

23

-ZeroRelevance- t1_izvhqko wrote

The problem with a hard takeoff is mostly computing power. If the AI is hardware-limited rather than software-limited, the anticipated exponential growth would likely take quite a bit longer, since each iteration would require new innovations in computing and manufacturing. AGI would definitely speed that process up significantly, but it would be far from instantaneous.

18

HeinrichTheWolf_17 t1_izvjeuz wrote

Software optimization plays a massive role too, though. Stable Diffusion, OpenAI Five, and AlphaZero all ended up achieving the same performance on only a fraction of the hardware they initially needed to run; the human brain can't really do that. Assuming we do eclipse the power of the brain via hardware soon, AGI will quickly shoot right past human learning speed. Not only that, we'll be giving it every GPU it needs until it can design its own hardware for itself.
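Just as a rough sketch of what I mean by software-side optimization (this assumes the Hugging Face diffusers API, nothing official from Stability): simply loading Stable Diffusion in half precision roughly halves the memory it needs, and attention slicing lets the same model run on ordinary consumer GPUs with only a few GB of VRAM:

```python
import torch
from diffusers import StableDiffusionPipeline

# Full precision (fp32) needs a high-end GPU just to hold the weights.
# Half precision (fp16) roughly halves the memory with near-identical output.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint, swap in any SD model
    torch_dtype=torch.float16,
).to("cuda")

# Attention slicing trades a little speed for a much lower peak memory,
# which is what lets the same model run on consumer cards.
pipe.enable_attention_slicing()

image = pipe("a castle on a hill at sunset").images[0]
image.save("castle.png")
```

Same model, same outputs, a fraction of the hardware, and all of it came from software work after the fact.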

I’d agree it won’t be instant, but it ain’t taking 20-30 years. The writing is on the wall.

17

-ZeroRelevance- t1_izvkafp wrote

Yeah, I get that. I probably didn’t convey it well enough in my original comment, but the main reason I don’t think it’ll be as instantaneous as people expect is that having better designs isn’t enough; you also need to manufacture them. The manufacturing alone will probably take several months, even with a superintelligence behind the scenes, because you’d need to develop new chip-manufacturing equipment and facilities, which are very finicky and expensive, find an appropriate site, and then actually construct the thing, which takes labour time and poses logistical challenges. An idea or design alone won’t suddenly manifest a next-gen supercomputer.

4

Talkat t1_izw467u wrote

Heh, I completely agree with you, but I was thinking of how when a human first learns a new skill it takes up all their brainpower and focus, yet once mastered it can be done without thought. Kinda like how getting an AI to do something first takes a lot of power, but once we nail it we can reduce that significantly.

AGI will be able to optimize itself like no one’s business. I think our hardware is powerful enough for an AGI... but to get there we will need more power, since we can’t write godlike AI ourselves.

3