[deleted] t1_je2duuc wrote
There are two separate futurism ideas here.
Kurzweil’s “Law of Accelerating Returns” is the idea that tech advances exponentially.
The Singularity is the idea that tech will get to a point where it’s advancing so fast that nothing about the future is predictable at all.
They’re related, but AI is just one exponential tech. There are lots of signs that tech in general is growing exponentially.
As for AI and the Singularity, we haven’t reached the hard takeoff point yet, which may or may not be possible. If it’s possible, it’s the point at which AGI emerges and starts to recursively improve itself. That creates an intelligence explosion, which quickly results in a Singularity. But it’s not the only thing that could cause a Singularity, it’s just the most obvious one.
Arowx OP t1_je4kpnz wrote
What if AI becomes a tool that lets us approach AGI faster?
It's kind of like the chicken-and-egg problem. Which came first: the AGI, or the AI toolkit that allowed the AGI to evolve faster from an AI?