Submitted by EchoXResonate t3_114xv2t in singularity
EchoXResonate OP t1_j8zikve wrote
Reply to comment by helpskinissues in What would be your response to someone with a very pessimistic view of AGI? by EchoXResonate
Do we have any safeguards against such a possibility? I'm not well versed in ML and neural networks, so I can't really imagine what such safeguards would look like, but surely we wouldn't be completely helpless against such an AGI?
helpskinissues t1_j8zj8fw wrote
It's a matter of scale. If we have an AGI that is 1 billion times smarter than a human, we have literally zero chance to do anything against it. Alignment or control is pointless.
However, I don't believe this is the right discussion to have. This is just Terminator fear propaganda that, unfortunately, is what people (like you and your friend) seem to have picked up, and it's what most people on Reddit talk about.
The actual reality is that we, humans, will evolve with AI. We will become a different species, composed of biology + artificial intelligence.
This is not about "how can humans with primate brains control very advanced AGIs??? they'll beat us!!". No. That's just absurd.
We will be the AGIs. The very AGI you fear will be part of your brain, not against you.
Why would you have an android at home that's 1 billion times smarter than you, rather than augmenting your own intelligence by 1 billion times?
https://en.wikipedia.org/wiki/Transhumanism
So yeah, your friend is right: a very advanced AGI will be unstoppable by humans. What I'd ask is: why would you want to stop the AGI? Become the AGI.
WikiSummarizerBot t1_j8zja3w wrote
>Transhumanism is a philosophical and intellectual movement which advocates the enhancement of the human condition by developing and making widely available sophisticated technologies that can greatly enhance longevity and cognition. Transhumanist thinkers study the potential benefits and dangers of emerging technologies that could overcome fundamental human limitations, as well as the ethics of using such technologies. Some transhumanists believe that human beings may eventually be able to transform themselves into beings with abilities so greatly expanded from the current condition as to merit the label of posthuman beings.
Surur t1_j8zo74h wrote
> Why would you have an android at home that's 1 billion times smarter than you, rather than you augmenting your intelligence by 1 billion times?
Won't you have the same problem of a transhuman a billion times smarter than other humans taking over the world? What's the difference, really?
helpskinissues t1_j8zopek wrote
>Won't you have the same problem of a transhuman a billion times smarter than other humans taking over the world?
Yep. So better inject the AGI into your brain ASAP. The same goes for weapons: if a single person has 1 billion nuclear bombs and the rest of us have sticks and stones, we're fucked.
So we'd all better hurry up and join the transcendence.