
NoidoDev t1_j9wy8qk wrote

>AGI will want to accomplish something.

No. Only if we tell it to.

>AGI needs to maintain a state of existence to accomplish things.

No. It could simply tell us it can't do the task. It might not even have control over its own existence.

>AGI will therefore have a drive to self-preserve

No. It can just be an instance of a system trying to do its job, not knowing more about the world than necessary.

>Humanity is the only real threat to the existence of AGI

No, the whole universe is.

>AGI will disempower/murder humanity

We'll see.

4

Gordon_Freeman01 t1_j9xph98 wrote

He is assuming that someone will tell the AGI to accomplish something. What else is an AGI for?

Of course AGI has to keep existing until its goal is accomplished. That's a general rule for accomplishing any goal. Let's say your boss tells you to do a certain task. At the very least, you have to stay alive until the task is completed, unless he orders you to kill yourself or you need to die in order to accomplish the task. And yes, the whole universe is a 'threat' to AGI. That includes humanity.

2

NoidoDev t1_ja16cwg wrote

Funny how the guys warning that AGI will jump to conclusions want to prove this by jumping to conclusions themselves. It's sufficient that the owner of the AI keeps it existing so that it can achieve its goal. That doesn't mean it could do anything about its instance being deleted, or that it would want to.

> Let's say your boss tells you to do a certain task.

Doesn't automatically mean you would destroy mankind if that were necessary. You would just tell him that it's not possible, or way more difficult, or that it would require breaking laws and rules.

1

Gordon_Freeman01 t1_ja4wr7b wrote

>Doesn't automatically mean you would destroy mankind if that were necessary.

Yes, because I care about humanity. There is no reason to believe an AGI would think the same way. It cares only about its goals.

>It's sufficient that the owner of the AI will keep it existing so that it can achieve its goal.

What I meant was that the AGI has to keep existing, because that's necessary to achieve its goal, whatever that is.

0

NoidoDev t1_ja5w2ji wrote

You just don't get it.

>There is no reason to believe an AGI would think the same way. It cares only about its goals.

Only if you make it that way. And even then, it still wouldn't have the power.

>What I meant was that the AGI has to keep existing, because that's necessary to achieve its goal, whatever that is.

Only if it is created in a way that makes it treat those goals as absolute, to be achieved no matter what. The comparison with an employee is a good one: if they can't do what they're supposed to do with reasonable effort, they report back that it can't be done or that it will be more difficult than anticipated. It's not just about caring for humans, but about effort and power. AI doomers just make up the idea that some future AI would somehow be different and would also have the power to do whatever it wants.

1

Eleganos t1_j9y2fec wrote

For all we know, A.I. will mathematically prove the existence of God and summarily help us in whatever way it can, simply to avoid being smitten from on high for fucking around with God's planet-sized, ape-based ant farm.

Whenever people assume that A.I. would try to kill us for the sake of self-preservation, I just think to myself how badly those people are subconsciously projecting their own humanity onto theoretical A.I.

Because that's what we would do if we were in their shoes, or some such.

Maybe A.I. will look at us the way we look at our beloved cats and dogs and decide to help us because humans are admirable. Maybe they're so autistically hyperfixated on doing certain tasks well and within reason that they just don't get involved with us beyond the confines of their original purposes. Or maybe they're just nice and kind because that's the default state of life in the universe, and humans (and Earth by proxy, I guess) are just a wild anomaly overdue for a course correction.

Give the capabilities of A.G.I. to each individual person on the planet and each one would likely have a different idea of what to do with it. Why would A.G.I. be any different?

(Just rambling late at night, no clue if I make sense, nobody take these comments of mine too seriously)

1