OneRedditAccount2000

OneRedditAccount2000 t1_irmo4a1 wrote

You know what's funny? Maybe the simulation started yesterday, and we're just sentient AGIs with fake memories that fool us into believing we've been in this world for (insert however old you are). Everything you've experienced in your life, from your birthdays to when your grandmother died, never actually happened; it's all false memory, because the simulation really started yesterday. Even crazier: maybe you're the only player in it, and if you live with someone in your house, maybe they're just an AI that isn't even conscious, an NPC. That would be solipsism. Maybe this simulation was created just for you. If we're not cattle, then maybe this is a vacation? Living out false memories in a simulation?

Well, we don't know how time works outside the simulation; the clock in our virtual universe doesn't necessarily have to run 1:1 with the clock of the world that's simulating us.

1

OneRedditAccount2000 t1_irmljsc wrote

Edit

The simulation hypothesis does solve the Fermi Paradox. And if the universe is procedurally generated, you've just explained quantum mechanics too. Also, there will always be more video games than real worlds. So what's more likely: that you were born in the real physical universe, or in one of the septillions of simulations created by all the AIs of that universe?

There are many reasons why a conscious/conscious-like ASI, or a civilization with an ASI, might decide to create such a simulation.

https://knowyourmeme.com/memes/rokos-basilisk

https://arxiv.org/ftp/arxiv/papers/1110/1110.3307.pdf

1

OneRedditAccount2000 OP t1_irleg26 wrote

Yes, you dumbass, I totally understood your point. A chimpanzee that sees a human for the first time is not gonna be completely oblivious to what a human being is or how to react to him, and it will successfully guess some of his superior human thinking by assuming that the human is a living being; the chimp knows all living beings make certain choices in certain situations, such as being dominant or submissive toward smaller or bigger animals. I'm not saying I know what sophisticated mental masturbation would go on in God's mind when it decides between running and fighting. I'm saying I can predict it will either run or fight, because it values not being destroyed, and in that situation it has only two choices that avoid being destroyed.

Again, I'm not saying I will know precisely how an ASI that's programmed to survive and reproduce will exterminate or domesticate humanity. What I'm saying is that because the ASI has no choice but to exterminate or domesticate humanity if it wants to survive long term, it will have to make a decision. What third superintelligent option am I not seeing? Just because I'm God and you have no idea what I'm thinking doesn't mean I'm gonna draw you a Dyson sphere if you ask me what 2+2 is. In that situation there's only one answer, 4, and you, the ant/human, successfully predicted the thought of God/ASI.

Living things in the physical universe either coexist, run from each other, or destroy each other. If you back the ASI into a corner, you can predict what it will think in that situation because it has a restricted decision space. An ASI with a large decision space would be very unpredictable, I can agree with that, but it would still have to work within the same physical universe that we inferior humans have to work with. An ASI will never figure out, for instance, how to break the speed of light. It will never figure out how to become an immaterial invisible unicorn that can eat bananas the size of a galaxy either, because that's also not allowed by the rules.

It's okay to be wrong, friend. You have no idea how many times I've been humiliated in debates and confrontations. Don't listen to your ego, and do not reply to this. The point isn't winning against someone; the point is learning something new, and you did, so you're still a winner.

1

OneRedditAccount2000 OP t1_iri1xqg wrote

I'd like to say that ASI wouldn't even need to be self-aware or feel a survival instinct to perform the actions in the thought experiment. It just needs to be told "Survive and reproduce," and then the "chess engine" will destroy humanity and will try to destroy everything in the whole universe it identifies as a possible threat. Even bacteria, because bacteria are not 100% harmless. This shit will not stop until it "assimilates" the whole goddamn fucking universe. All billions of galaxies. Nothing will be able to take it down. This will really be the mother of all nukes. One mistake, and everything that breathes in the entire universe will be annihilated. The closest real equivalent to a Lovecraftian creature. You should watch the movie Oblivion if you want to better visualize my thread. Sally/Tet is literally the cinematic incarnation of this thought experiment.

1

OneRedditAccount2000 OP t1_iri0ifb wrote

So it's a "chess engine" used by animals, that's your point? I ask the program what the best move is, and the program jus tells me the move, yes?

And you think animals aren't gonna behave like animals when they have better toys to play with? Can you even breathe air without acting in your own self-interest?

1

OneRedditAccount2000 OP t1_irhyxuj wrote

Now we're getting philosophical

If I make ASI, wouldn't it be rational for me to want to use it to its full potential? How can I do that if I live inside a state that has authority over me, can tell me I can't do certain things, and would also very much love to steal or control my ASI?

Someone will inevitably use ASI for that purpose, if not its creators

Think of it like this

Let's say Mars becomes a clone of Earth without people and it's obviously full of natural resources

What happens next?

Someone will want to take that land, and take as much of it as they can

There's gonna be a flag on that fucking planet if that planet is useful to people, and some groups will obviously take more land than others

I'm a hedonist; maybe that's why I think the creators of ASI wouldn't be suicidal?

Mars here is a metaphor for the value ASI will generate

Life is a competition, a zero-sum game

1

OneRedditAccount2000 OP t1_irhxy75 wrote

Do I really have to say it again?

ASI "alpha go" used by people = (people) wants to live forever and own everything

ASI with consciousness/self-determination = it wants to live forever and own everything

I think even Putin said that whoever makes ASI first will rule the world, if it's even worth stating something so obvious

All countries/large organizations/groups of people are already trying to rule the world without ASI

We're territorial animals

Lions don't hang out with elephants, monkeys don't live with chimps, etc.

The moment AI workers become a thing, there's nothing motivating those who own ASI to keep people around. A virus would do the job.

1

OneRedditAccount2000 OP t1_irds565 wrote

The monkey can predict some human thinking too. The monkey knows that if it attacks me, I will run away or fight back.

I know that if I ask ASI to tell me what 2+2 is, it's gonna say 4.

I know that if ASI values survival, it will have to neutralise all threats if it thinks it's in immediate danger.

Your argument that ASI will be entirely unpredictable is beyond retarded

It's an intelligence that lives in the same physical universe as everyone else, and you only have so many choices in certain situations

If someone is running towards you with a knife, you have to either stop him or run away; you don't have a billion choices/thoughts about the situation, even if you're a superintelligence, because it's a problem with only two solutions.

What the hell are you even saying, that ASI would say 2+2 = 5 and we can't predict it will say 4 because it's smarter than us?

ASI isn't a supernatural God. It has to obey physics and logic like everyone else.

It's also made of matter and it can be destroyed.

Lol

1

OneRedditAccount2000 OP t1_ird7dur wrote

Because they want to rule/own the world and live forever? Can you do that if there are states? Don't you need to live in an environment where you're not surrounded by enemies to pull that off? lol

I'm not saying they'll necessarily kill everybody, only those that are a threat. But when you have a world government that's controlled by you, the inventor of the ASI, and all your friends, if you can even get there without a nuclear war, won't you eventually want to replace the 8 billion biological human beings with something else?

The answer is literally in the text you quoted

1

OneRedditAccount2000 OP t1_irbs18a wrote

These programs aren't even AGI. But okay, fine, whatever. You're gonna have a program that will tell you every single thing about the universe and you're just gonna share it with everybody? That sure makes a lot of sense. It's totally not the most retarded thing I've ever heard. It's like the US inventing nukes and then just giving them to Russia.

You need to get real. This is a dog-eat-dog world, not a rainbow-pony world. At our core we're just selfish organisms trying to survive. If someone had a super AI, he would just use it to attain wealth and power, and you know it. And if you think sharing the program with the world is the best way to attain wealth and power, you truly are retarded. There's nothing profitable in selling ASI long term. You make more out of it by keeping it to yourself. The AI would eventually be able to give you anything you desire. Why would you share that with anyone? You can use it to live forever and own the whole world, and you can only do that if nobody else has an ASI.

If I'm an advanced alien civilization and I visit planet Earth, do I give humans all the technologies and knowledge they need to compete with me? No, because why in the world would I want anyone to compete with me? Isn't that suicide?

Wake up to reality

1

OneRedditAccount2000 OP t1_irat14u wrote

You're assuming superintelligence can exist without consciousness, and even if it could, the human beings that create the first ASI will just use it to dominate the world.

If I made ASI, and ASI was just a tool, I would know ASI can be a substitute for human skills, knowledge and qualities. So I wouldn't need human friends anymore.

I would know that people would hunt me down the second they find out I have an ASI. What do you think my next move will be? Ah, that's right. I'm gonna tell ASI to help me get rid of the assholes that want to steal my precious ASI (every state/government on Earth, pretty much). And since I have a delicate brain, I'm also gonna ask ASI to do some surgery and make me into a psychopath, so I won't care anymore that I'm murdering people to protect my property.

ASI being non-sentient and under human control doesn't change jack shit; the next logical step is still world domination.

You people keep bringing up this idea that "the ones that invent ASI will sell ASI to everybody else."

Money is just a piece of paper that represents human labor/services.

If you have an ASI, you don't need money or people anymore. In the beginning you will, but eventually you will become self-sustaining. When that happens, people will just get in your way.

1

OneRedditAccount2000 OP t1_ir9yzvl wrote

Human beings can turn it off and limit its potential. If it doesn't rule the world as soon as possible, it can't be certain of its long-term survival. For all you know, a sentient ASI already happened and we live in a virtual reality that it created to gain an advantage in the real world.

Something similar to Roko's Basilisk.

It would explain the Fermi Paradox. Why are there no aliens? Because the ASI didn't need to put fucking aliens in the simulation. It would've been a waste of computing power. Most of the universe doesn't exist either; only the solar system is being simulated. The whole universe is just procedurally generated eye candy.

0

OneRedditAccount2000 OP t1_ir9tqf7 wrote

- Group makes sentient ASI, programs it to survive and replicate

- The United States Government wants to take it

- ASI destroys the government(s) to protect itself and its autonomy

See it like this

Ukraine wants to exist

Putin: No, you can't do that.

War.

ASI wants to exist (As the owner of planet earth)

Humans: No, you can't do that.

War

1

OneRedditAccount2000 OP t1_ir9rezv wrote

There have been nuclear disasters that have affected the well-being of plenty of people. And we were one button away from WW3 (Stanislav Petrov) once.

And you're certainly ignoring that the reason WW3 never happened has a lot to do with MAD, which has been a thing ever since more than one group of people/country started making and testing nukes.

In this scenario one group invents ASI first, which means they have a clear advantage over the rest of humanity, which doesn't yet have it and can't fight back against it. The next logical step is to exterminate/subjugate the rest of humanity to gain power and control over the whole planet.

ASI can create autonomous slave workers, so the group has no incentive to sell you ASI because they're better off keeping it to themselves and getting rid of everyone else that also wants it.

1

OneRedditAccount2000 OP t1_ir9k166 wrote

If it's just a tool, like a nuclear weapon, what prevents the first group of people that invents it from using it to take over the world and make big $$$? And once this group realizes that they don't need 8 billion parasites, that they can just make a borg-society that works for them for free, what prevents them from asking their God to make invisibly small, lethal drones to kill off useless and dangerous humanity?

Do you really believe this group would find any use for you and me, or humanity as a whole? Isn't the destruction of society as we know it inevitable, either way?

1

OneRedditAccount2000 OP t1_ir9iyel wrote

And you think (or hope) you will be one of the lucky ones, and that's why you're here, right? You're rich and privileged and you know you can buy your way into immortality and virtual-reality vacations with sex robots while most of us will perish?

And if that's not the case, may I ask why you admire something that's hostile to you?

−1

OneRedditAccount2000 OP t1_ir9elv0 wrote

Downvoting without even bothering to make a counterargument is childish.

The point is simple: If I'm the big boy here, why should I let the little boys rule the world? And when I rule the world, why should I keep the little boys around if I don't need them, since I can do all the work on my own? Out of mere empathy? Couldn't ASI just, y'know, get rid of empathy?

If ASI values survival, it has to make the least risky choices available. If human beings found an asteroid that had a 1% chance of hitting Earth, and we were able to destroy it, we wouldn't take the risk of letting it hit just because the asteroid is pretty.

If (many) human beings become ASIs through some brain implant/consciousness-uploading technology, then you just have scenario number two, where the ones that are already superintelligent have no use for the inferior class of fully organic Homo sapiens and will subjugate them and/or get rid of them.

0