Calm_Bonus_6464 t1_j1snyzn wrote
Reply to comment by OldWorldRevival in Will the singularity require political revolution to be of maximum benefit? If so, what ideas need to change? by OldWorldRevival
But we're not just talking about AGI here; the Singularity would require ASI. Not just human-level intelligence, but intelligence far beyond the combined capabilities of all humans who have ever lived. A being that intelligent could pretty easily orchestrate political takeovers, or even destroy humanity if it so desired.
OldWorldRevival OP t1_j1splv0 wrote
When I state "singularity requires political revolution to be of maximum benefit," I mean that the political changes have to come before the singularity.
Otherwise, the benefits may end up concentrated in the hands of an elite: as automation makes the elite and those with resources self-sufficient in food, labor, etc., they continually lose their need for the masses.

But it could be worse than that, with an elite few controlling the AGI.

Or lots of people become homeless, and then they're treated the way the homeless are treated now.
Calm_Bonus_6464 t1_j1srzke wrote
ASI does come before the Singularity. And ASI would resolve most of those concerns. ASI has no reason to be any more benevolent to elites than to anyone else, and elites cannot control a being that is far more intelligent than they are. You're thinking of AGI, not ASI; both have to happen before the Singularity.