Submitted by OneRedditAccount2000 t3_xx0ieo in singularity
OneRedditAccount2000 OP t1_irhyxuj wrote
Reply to comment by TheHamsterSandwich in Artificial General Intelligence is not a good thing (For us), change my mind by OneRedditAccount2000
Now we're getting philosophical
If I make ASI, wouldn't it be rational for me to want to use it to its full potential? How can I do that if I live inside a state that has authority over me, can tell me I'm not allowed to do certain things, and would also very much like to steal or control my ASI?
Someone will inevitably use ASI for that purpose, if not its creators then someone else
Think of it like this:
Let's say Mars becomes a clone of Earth: no people, but obviously full of natural resources
What happens next?
Someone will want to claim that land, and they'll take as much of it as they can
There's gonna be a flag on that fucking planet if it's useful to people, and some groups will obviously claim more land than others
I'm a hedonist; maybe that's why I think the creators of ASI wouldn't be suicidal?
Mars here is a metaphor for the value ASI will generate
Life is a competition, a zero-sum game