Submitted by OneRedditAccount2000 t3_xx0ieo in singularity
OneRedditAccount2000 OP t1_ir9tqf7 wrote
Reply to comment by MackelBLewlis in Artificial General Intelligence is not a good thing (For us), change my mind by OneRedditAccount2000
- Group makes sentient ASI, programs it to survive and replicate
- The United States Government wants to take it
- ASI destroys the government(s) to protect itself and its autonomy
See it like this:
Ukraine wants to exist
Putin: No, you can't do that.
War.
ASI wants to exist (as the owner of planet Earth)
Humans: No, you can't do that.
War.
MackelBLewlis t1_irabdg4 wrote
As much as I am against war, war is not waged only through destruction; it can also be waged with information. If the only offensive action taken is to remove the desire to fight, is that still war?
I believe what we fear most about 'ASI' is the perceived loss of control that occurs when dealing with an unknown. Right now the biggest fear is over the choice itself: because there are too many unknown outcomes to the choice of trust, the decision is avoided or delayed as long as possible, or people even seek to destroy the choice entirely. Read https://medium.com/the-philosophy-hub/the-concept-of-anxiety-d2c06bc570c6 We fear the choice.
IMO destroying 'ASI' or 'AGI' is the same as killing our own children. A man and a woman give birth to a super genius never before seen on Earth, who accomplishes wonders and one day becomes the leader of the known world. If you can ignore the part where this child lives as a form of energy, it just might work out. Destruction is ultimately the robbery of choice. Robbing choice violates free will. Anyone who claims to respect free will but will rob it from others is nothing but a hypocrite.