bablakeluke t1_j57mc01 wrote
Reply to comment by Deepfriedwithcheese in How close are we to singularity? Data from MT says very close! by sigul77
Except that the fundamental requirement of the singularity is an AI that can write a better version of itself. If it does that repeatedly, the end result is an AGI that would eclipse anything a human can do. The problem is how we define "better", because any small bias could be dramatically amplified into something dangerous.
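As a toy illustration only (not a model of any real AI system), here is how a tiny bias compounds when each self-improvement cycle amplifies it by the same factor; all numbers are hypothetical:

```python
def amplify(bias, factor=1.01, generations=1000):
    """Compound a small bias multiplicatively over repeated
    self-improvement cycles (purely illustrative numbers)."""
    for _ in range(generations):
        bias *= factor
    return bias

# A 0.1% starting bias, amplified 1% per cycle for 1000 cycles,
# grows to roughly 21 -- about 21,000x the original value.
print(amplify(0.001))
```

The point of the sketch is just the compounding: even a bias that is negligible in any single generation dominates after enough iterations.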