Submitted by flexaplext t3_127o4i0 in singularity
A lot of people believe the outcome of true AGI / ASI / The Singularity will most likely swing only one of two ways:
Either a) everyone dies or b) society trends towards a utopia.
They're not sure which one it will be exactly, but they believe it will inevitably be one of these two.
This is the concept: an outcome at either extreme. I think it really deserves a universal name if it doesn't already have one (I don't know of any), because it's already being referred to a lot and is only going to be referred to more. It would be much easier to discuss if it had a recognized name.
People may lean towards one of the two outcomes, but that's beside the point. I guess you could call them positive or negative with respect to the concept.
NOTE: I'm not saying everyone believes this. I'm saying enough people do that it deserves its own name. So any suggestions?
At this point, it's almost a philosophy of its own*. I see lots of people saying they're willing to chance ASI happening because it could lead to a utopia, and if it instead kills us all, oh well: we're going to die someday anyway / life is pretty shit at the minute / life is just temporary. We're not going to be able to stop AI development, so we may as well dive headfirst into it, flip the coin, and hope for the best.
Again, this is becoming such a widely accepted philosophy that it really needs a name of its own.
*P.S. Again, I'm not necessarily saying that I follow this line of thought myself, just that I have seen a lot of people who do.
Intrepid_Meringue_93 t1_jef3cgp wrote
Alignment gamble