Submitted by OneRedditAccount2000 t3_xx0ieo in singularity
OneRedditAccount2000 OP t1_iri1xqg wrote
Reply to comment by TheHamsterSandwich in Artificial General Intelligence is not a good thing (For us), change my mind by OneRedditAccount2000
I'd like to say that ASI wouldn't even need to be self aware/feel a survival instinct to perform the actions in the thought experiment. It just needs to be told "Survive and reproduce" and then the "chess engine" will destroy humanity, and will try to destroy everything in the whole universe it identifies as a possible threat. Even bacteria, because bacteria are not 100% harmless. This shit will not stop until it "assimilates" the whole goddamn fucking universe. All billions of galaxies. Nothing will be able to take it down. This will really be the mother of all nukes. One mistake, and everything that breathes in the entire universe will be annihilated. The closest real equivalent to a lovecraftian creature. You should watch the movie oblivion if you want to better visualize my thread. Sally/Tet is literally the cinematic incarnation of this thought experiment.
Viewing a single comment thread. View all comments