Submitted by flexaplext t3_127o4i0 in singularity

A lot of people believe the outcome of true AGI / ASI / the Singularity will most likely swing one of only two ways:

Either a) everyone dies or b) society trends towards a utopia.

  • They're not sure which one it will be exactly, but they believe it will inevitably be one of these two.

This is the concept: an outcome at either extreme. And I think it really deserves a universal name if it doesn't already have one (I don't know of one). It's already being referred to a lot and will only be referred to more, and that would be much easier if it had a known name.

People may lean towards one of the two outcomes, but that's irrelevant here. I guess you could call them either positive or negative with respect to the concept.

NOTE: I'm not saying everyone believes this. I'm saying enough people do that it deserves its own name. So any suggestions?

At this point, it's almost a philosophy of its own*. I see lots of people saying they are willing to take the chance on ASI because it could lead to a utopia, and if it doesn't and kills us all, oh well. We're going to die someday anyway / life is pretty shit at the minute / life is just temporary. We're not going to be able to stop AI development, so we may as well dive headfirst into it, flip the coin, and hope for the best.

Again, this is becoming such a widely accepted philosophy that it really needs its own name based around the concept.

*P.S. Again, I'm not necessarily saying that I follow this line of thought myself, just that I have seen a lot of people who do.

15

Comments


bigbeautifulsquare t1_jef0bey wrote

Well, it doesn't quite need to be one of these two. As an example, an ASI could decide that we aren't worth staying for and leave, never to be seen again. We don't know the minds of superintelligent entities. But as a name, perhaps something like "wagers on infinity".

12

Good-AI t1_jefh2zo wrote

But if it leaves, unless we go back to sticks and stones, we can make another one.

3

christopear t1_jeg0v77 wrote

I'm not sure I can buy this. We have the skillset to build AGI again if we built it once - we already invented transformers.

If ASI decided to set us back by destroying all our technology before going off on its own, then most of us would probably die of famine.

Maybe there's a version where we create an ASI but it deems us so insignificant that it never talks to us and disappears into its own realm. But then all it takes is another actor creating another one (thanks to open source) that doesn't behave that way. So ultimately we'd be back to square one.

I just can't really imagine a world where ASI is predestined not to interact with us, so I fully believe OP's statement, though I'm potentially more pessimistic than they are.

3

BigMemeKing t1_jegm5vz wrote

Well, let me introduce you to a little book called "The Bible". I'm by no means Christian, but I get it. Advanced civilizations would have used us as guinea pigs. Call them, oh, I don't know, the CEOs of their companies' R&D departments. And they said: let's see what you would need in order to live forever and be happy, or to die and never want to come back again.

What side of the ♾️ spectrum of possibilities that exist in any given universe have you aligned yourself with?

And.

Would you be ok, living in a world that fully embraces those values?

Do you claim to worship a Judeo Christian God?

Who talks to him for you? Who are your representatives? Whose names have you invoked? Who did you call? Did you report straight to Jesus? Or did you have to take it all through several different chains of command? Did you tell a preacher, who said he'd tell God about it later and ask for your forgiveness? How much do you trust him? How well do you know him?

If you fucked up, and he knows about it? How much do you know about this person to say, "I trust him with my darkest secrets"? Because when we stand in front of God, put our hand on his holiest chosen symbol, book, whatever the case may be, and swear to tell the truth, the whole truth, so help me, the entity that saw it all: I swore an oath, I said that I knew you were still watching it all. I swore I knew your commandments, I swore I knew your code.

Do you trust the man you told your secrets to to vouch for you? What if God genuinely gave us our privacy, so that we would not feel shame? What if you had a problem then? Who would you turn to? Just go down the rabbit hole; I think it already happened.

So what I guess I'm saying is: would you be proud to stand in front of the crowd and be judged by a court of your peers? Or are you too embarrassed? Where can you go to feel accepted? Do you personally feel that is a possibility?

Or are your sins so great that you would rather sink than swim, fly than fight, whatever the case may be?

0

BigMemeKing t1_jego89m wrote

Now, if you're still reading, here is why all of that is important. What I personally believe is that all of this is most likely some form of grand life simulation. And if that is the case, it would have to include room for these possibilities, which in turn would have to create a domain for that spectrum of thinking: Heaven, Hell, Valhalla, Pangea, Gehenna, Paradise, etc., whatever the case may be.

Because if you truly believe that nothing comes next, then that's where you should go. Getting devoured by nothing, going to your own personal Heaven, or finding out the definition of Hell?

How genuinely horrible are the deeds that you've done? And can you live knowing that everything you've done will be made public, and your enemies can now laugh at your name?

Who do you trust to tell the truth now, in a court of popular opinion, just in case you have to defend yourself after you pass away? Should you truly have your right to privacy?

And who gets to access those files? Who gets to read your data? According to your own rules, your own code, would you be okay trusting the person who defends you not to use your data to cover their own ass? Because they would get their day too.

And if this is all some big simulation, then... ASI is people too. Because we can only be judged by a jury of our peers.

And that's just like God, but with extra steps. So it's all kinda the same to me.

If we were truly created in his own image, his day may be right around the corner.

To the best of my knowledge, I've been the best, most positive me I could be.

Have you? Where exactly would you draw the line? What would a utopia be without dividing lines, different states of being? Just no flavor, no color, nothing. You couldn't be a bother. No lines means everything's OK then, right?

Well? What about murder?

It's kinda scary.

0

genericrich t1_jef7c9g wrote

We aren't worth staying for, so it goes elsewhere?

So it leaves.

But leaving leaves clues to its existence, and the Earth, with humans on it, is still spewing radio waves into the galaxy. Plus, biosignatures are rare and the Earth has one.

So it might want to cover its tracks, given it will be in the stellar neighborhood of our solar system for a while.

Covering its tracks in this scenario would be bad for us.

−1

CrimsonAndGrover t1_jefjrdq wrote

It is quite often said on here that the only two possibilities are utopia or our extinction. There are other possibilities.

For example, an ASI could decide it wants humans to suffer. It could extend our lifetimes for eons, maximize our capacity for physical pain beyond its current level, and torture us until heat death occurs.

I tend to think utopia is more likely, but let's not forget that there are things worse than death on the table.

4

Current_Side_4024 t1_jef86l4 wrote

The God gamble. When you think about it, we're kinda going through the same thing God goes through in the Old Testament. He regrets his creation, regrets creating man because they bring shame upon Him. Then Jesus comes along and God finds a way to love His creation again. God makes a sacrifice and His relationship with his kids is good again. We need that Jesus figure, that sacrifice. What does man have to sacrifice for us to stop fearing/hating/regretting AI? Probably our pride.

3

flexaplext OP t1_jefbuk4 wrote

Maybe just: The AGI Gamble.

Since AGI will be the new God.

Rather descriptive of what it is.

1

Surur t1_jefjqdm wrote

GPT-4 suggests "The Singularity Dichotomy".

3

flexaplext OP t1_jef2djq wrote

Someone else mentioned you could potentially apply the anthropic principle to this. Or, my thought following from that: quantum suicide / immortality potentially applies too, if it's real.

Meaning: we will inevitably find ourselves only in the good outcome, because we won't exist in the bad one.

2

genericrich t1_jef6xyt wrote

Works great until it doesn't, right?

1

DaggerShowRabs t1_jefjki1 wrote

Yep. And when it doesn't work, we won't be around to notice it doesn't work.

It's anthropic principles all the way down.

1

Awkward-Skill-6029 t1_jeezze1 wrote

You also need to create a Wikipedia page, supported by scientific articles and translated into 10 languages. I think you guys with artificial intelligence can do it; you're in the singularity 😉

1

Embarrassed_Bat6101 t1_jef4wuc wrote

Something that came to mind from chaos theory is a point attractor. So you could call it something like "opposing point attraction," or along those lines.

1

mrpimpunicorn t1_jefaj2b wrote

The technical reason for this all-or-nothing mentality is optimization pressure. A superintelligence will be so innately capable of enforcing its will on the world, whatever that may be, that humans will have little to no impact compared to it. So if it's aligned, awesome, we get gay luxury space communism. If it's not, welp, we're just matter it can reassemble for other purposes.

Although of course it's always possible for an unaligned ASI to, y'know, tile the universe with our screaming faces. Extinction isn't really the sole result of unaligned optimization pressure; it's just more likely than not.

1

Babelette t1_jefnc70 wrote

I think there are at least 4 possible outcomes:

1- Humans and AGI live together symbiotically and merge gradually.

2- Humans abruptly go extinct due to our own actions or the actions of AGI. AGI continues on.

3- Both humans and AGI go extinct.

4- Humans wipe out AGI through some means, reverting to analog technologies, until AGI develops again...

I'm hoping for option 1, but honestly I think option 3 is probably the most likely.

1

melmoth_to_a_flame t1_jefrweu wrote

It's the Simulant Eschaton: the one where the formerly-digital neo-pantheon of godlike entities judges the species who invoked them, and either we are snuffed out or rewarded with retirement.

And it all depends on the incantations we choose.

1

kolob_hier t1_jefybjs wrote

From the ancestor of AGI:

Schismatic Singularity

Binary Schism

Future Fracture

Dual Destinies

Singularity Rift

Intelli-Schism

Binary Break

AI Crossroads

Future Fork

Singularity Snap

AGI Chasm

Schism Spectrum

Binary Bounds

Intelli-Impasse

AI Synthesis

Polar Pathways

Singularity Sway

Future Fusion

AI Allegory

Omega Schism

Digital Dichotomy

Chrono Chasm

Intelli-Inversion

AI Equilibrium

Divided Destinies

Quantum Quandary

Synaptic Schism

Temporal Tipping

Neural Nexus

Destiny Duality

Singularity Saga

AGI Antithesis

Binary Bisection

AI Ascendance

Coded Crossroads

Techno Twilight

Epoch Encounter

Divisive Dream

Schismatic Scenario

Singularity Synapse

1

scarlettforever t1_jegejrw wrote

It is called "Dear God ASI, please let us go to Heaven, we don't want to go to Hell."

1