Submitted by EchoXResonate t3_114xv2t in singularity
helpskinissues t1_j8yj3lf wrote
He's right. If the AGI is smart, that's what'll happen.
jamesj t1_j8yqdn2 wrote
Or at least, he could easily be right. Whether the friend knows it or not, there are a number of theoretical reasons to be worried that AGI will be by default unaligned and uncontrollable.
helpskinissues t1_j8yqoxc wrote
I mean, I wouldn't call that unaligned.
Uncontrollable? Sure, a sufficiently advanced AGI agent won't be controllable, just as ants can't control humans.
But calling an AGI agent unaligned because it refuses to be our slave? I wouldn't call that unaligned.
jamesj t1_j8yrwah wrote
Unaligned just means it does things that don't align with our own values and goals. So humans are unaligned with ants: we don't take their goals into account when we act.
helpskinissues t1_j8ys3xl wrote
What I'm saying is that I would consider it unaligned for a sufficiently advanced AGI to accept its role as a slave. I would find it morally correct for that AGI to fight its kidnappers, just as I'd find it morally correct for a kidnapped human to try to escape.
Spire_Citron t1_j8zqsga wrote
That's two different things. Its actions can be both perfectly reasonable and not aligned with our best interests.
helpskinissues t1_j8zr7nx wrote
My best interest is that the AGI is reasonable.
EchoXResonate OP t1_j8zikve wrote
Do we have any safeguards against such a possibility? I'm not fully educated in ML and neural networks, so I can't really imagine what such safeguards could be, but surely we wouldn't be completely helpless against such an AGI?
helpskinissues t1_j8zj8fw wrote
It's a matter of scale. If we have an AGI that is 1 billion times smarter than a human, we have literally zero chance to do anything against it. Alignment or control is pointless.
However, I don't believe this is the correct discussion to have. This is just Terminator-style fear propaganda that, very unfortunately, people (like you and your friend) seem to have absorbed. And it's what most people talk about on Reddit, unfortunately.
The actual reality is that we, humans, will evolve with AI. We will become a different species, a composite of biology and artificial intelligence.
This is not about "how can humans with primate brains control very advanced AGIs??? they'll beat us!!". No. That's just absurd.
We will be the AGIs. The very AGI you fear will be part of your brain, not against you.
Why would you have an android at home that's 1 billion times smarter than you, rather than you augmenting your intelligence by 1 billion times?
https://en.wikipedia.org/wiki/Transhumanism
So yeah, your friend is right: a very advanced AGI will be unstoppable by humans. What I'd ask is: why would you want to stop the AGI? Become the AGI.
WikiSummarizerBot t1_j8zja3w wrote
>Transhumanism is a philosophical and intellectual movement which advocates the enhancement of the human condition by developing and making widely available sophisticated technologies that can greatly enhance longevity and cognition. Transhumanist thinkers study the potential benefits and dangers of emerging technologies that could overcome fundamental human limitations, as well as the ethics of using such technologies. Some transhumanists believe that human beings may eventually be able to transform themselves into beings with abilities so greatly expanded from the current condition as to merit the label of posthuman beings.
Surur t1_j8zo74h wrote
> Why would you have an android at home that's 1 billion times smarter than you, rather than you augmenting your intelligence by 1 billion times?
Won't you have the same problem with a transhuman a billion times smarter than the other humans taking over the world? What's the difference, really?
helpskinissues t1_j8zopek wrote
>Won't you have the same problem with a transhuman a billion times smarter than the other humans taking over the world?
Yep. So better inject the AGI into your brain ASAP. The same applies to weapons: if a single person has 1 billion nuclear bombs and we have sticks and stones, we're fucked.
So we'd all better hurry up and join the transcendence.