Submitted by OneRedditAccount2000 t3_xx0ieo in singularity
OneRedditAccount2000 OP t1_ir9k166 wrote
Reply to comment by Zamorak_Everknight in Artificial General Intelligence is not a good thing (For us), change my mind by OneRedditAccount2000
If it's just a tool, like a nuclear weapon, what prevents the first group of people that invents it from using it to take over the world and make big $$$? And once this group realizes that they don't need 8 billion parasites, that they can just make a borg-society that works for them for free, what prevents them from asking their God to make invisibly small, lethal drones to kill off a useless and dangerous humanity?
Do you really believe this group would find any use for you and me, or humanity as a whole? Isn't the destruction of society as we know it inevitable, either way?
Zamorak_Everknight t1_ir9ke9l wrote
If we are picturing doomer scenarios, then in that context I agree that it really isn't that different from, as you said, nuclear weapons.
Having said that, we seem to have a pretty good track record of not blowing up existence with our arsenal of nukes over the last ~century.
OneRedditAccount2000 OP t1_ir9rezv wrote
There have been nuclear disasters that have affected the well-being of a great many people. And we were one button away from WW3 at least once (Stanislav Petrov, 1983).
And you're certainly ignoring the fact that the reason WW3 never happened has a lot to do with MAD, which has been in play ever since more than one country started building and testing nukes.
In this scenario one group invents ASI first, which means they have a clear advantage over the rest of humanity, which doesn't yet have it and can't fight back against it. The next logical step is to exterminate/subjugate the rest of humanity to gain power and control over the whole planet.
ASI can create autonomous slave workers, so the group has no incentive to sell you ASI; they're better off keeping it to themselves and getting rid of everyone else who also wants it.
Zamorak_Everknight t1_irc8t3e wrote
>The next logical step is to exterminate/subjugate the rest of humanity to gain power, control over the whole planet.
How... is that the next logical step?
OneRedditAccount2000 OP t1_ird7dur wrote
Because they want to rule/own the world and live forever? Can you do that if there are states? Don't you need to live in an environment where you're not surrounded by enemies to pull that off? lol
I'm not saying they'll necessarily kill everybody, only those that are a threat. But when you have a world government controlled by you, the inventor of the ASI, and all your friends (if you can even get there without a nuclear war), won't you eventually want to replace the 8 billion biological human beings with something else?
The answer is literally in the text you quoted