Submitted by OneRedditAccount2000 t3_xx0ieo in singularity
drizel t1_irk9hhu wrote
Reply to comment by OneRedditAccount2000 in Artificial General Intelligence is not a good thing (For us), change my mind by OneRedditAccount2000
You missed my key point: in my example, NO monkey has EVER seen a human before. No one has ever seen an ASI or even an AGI, so expecting to understand how it might "think" is unrealistic.
OneRedditAccount2000 OP t1_irleg26 wrote
Yes, you dumbass, I totally understood your point. A chimpanzee that sees a human for the first time is not gonna be completely oblivious to what a human being is or how to react to him. It will successfully guess some of his superior human thinking by assuming the human is a living being, because the chimp knows all living beings make certain choices in certain situations, such as being dominant or submissive toward smaller or bigger animals. I'm not saying I know what sophisticated mental masturbation would go on in God's mind when it decides between running and fighting; I'm saying I can predict that it will either run or fight, because it values not being destroyed, and in that situation it has only two choices that avoid being destroyed.
Again, I'm not saying I know precisely how an ASI would exterminate or domesticate humanity if it were programmed to survive and reproduce. What I'm saying is that because the ASI has no choice but to exterminate or domesticate humanity if it wants to survive long term, it will have to make a decision. What third superintelligent option that I'm not seeing could it take? Just because I'm God and you have no idea what I'm thinking doesn't mean I'm gonna draw you a Dyson sphere if you ask me what 2+2 is. In that situation there's only one choice, 4, and you, the ant/human, have successfully predicted the thought of God/ASI.
Living things in the physical universe either coexist, run from each other, or destroy each other. If you back the ASI into a corner, you can predict what it will think in that situation because it has a restricted decision space. An ASI with a large decision space would be very unpredictable, with that I can agree, but it would still have to work within the same physical universe that we inferior humans have to work with. An ASI will never figure out, for instance, how to break the speed of light. It will never figure out how to become an immaterial invisible unicorn that eats bananas the size of a galaxy either, because that's also not allowed by the rules.
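Here's a toy sketch of what I mean by a restricted decision space (the option names and the survival rule are just made up for illustration, not a model of any real ASI): however smart the agent is, it can only pick from the options the situation allows, and once a survival constraint filters them, its behavior becomes predictable from the outside.

    # Toy sketch of a restricted decision space (option names and the survival
    # rule are assumptions made up for illustration).
    def surviving_options(options, survives):
        """Keep only the options compatible with survival."""
        return [o for o in options if survives(o)]

    # A cornered agent: in this toy setup only "fight" and "flee" avoid destruction.
    cornered = surviving_options(
        options=["fight", "flee", "do nothing"],
        survives=lambda o: o != "do nothing",  # assumption: doing nothing means destruction
    )
    print(cornered)  # ['fight', 'flee'] -- the choice is predicted without reading its "mind"

The point of the sketch is only that the prediction comes from the situation, not from understanding the agent's internals.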
It's okay to be wrong, friend. You have no idea how many times I've been humiliated in debates and confrontations. Don't listen to your ego and do not reply to this. The point isn't winning against someone; the point is learning something new, and you did, so you're still a winner.
drizel t1_iro9e9t wrote
Lol ok big brain. Your whole argument makes a ton of assumptions which you regard as fact.
OneRedditAccount2000 OP t1_irpvaqy wrote
The only assumptions are that the ASI is programmed to survive and reproduce, and that the people who make the ASI aren't suicidal.