bablakeluke t1_j57mc01 wrote

Except the fundamental requirement of the singularity is an AI that can write a better version of itself. If it iterates on that process indefinitely, the end result is an AGI that would eclipse anything a human can do. The problem is how we define "better," because any small bias could get dramatically amplified into something dangerous.
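A toy sketch of that amplification effect: if each generation writes a slightly more capable successor, even a small, fixed bias in the objective does growing damage as capability compounds. All the numbers here (growth rate, bias size, generation count) are illustrative assumptions, not claims about real systems.

```python
# Toy model of recursive self-improvement (illustrative assumptions only).
capability = 1.0   # ability of the current version, arbitrary units
bias = 0.01        # small fixed misalignment in its objective (1%)

for generation in range(50):
    capability *= 1.1  # each version writes a 10%-better successor

# The damage a misaligned objective can cause scales with capability,
# so a 1% bias that was negligible at the start is no longer small.
harm = capability * bias
print(f"capability after 50 generations: {capability:.1f}")
print(f"effective harm from the same 1% bias: {harm:.2f}")
```

The point of the sketch is that the bias itself never changes; it's the exponential growth in capability that turns it dangerous.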

12