Comments


marvinthedog t1_irbqfes wrote

I'd rather die from an AGI injecting nanobots into my bloodstream than from a nuclear war. In the former option I probably get to live a few years longer before it happens, it's probably less painful, and it's a way cooler way to die.

31

Cryptizard t1_irbz6q7 wrote

And something we created gets to live on and spread throughout the galaxy. If we all die from nuclear holocaust we will have exterminated the only known intelligence in the universe.

12

marvinthedog t1_irdo2im wrote

If the thing we create gets to have a consciousness, and that consciousness gets to experience less suffering and more happiness than we did, then that's a win in my book.


One worrisome clue that points to future AGIs/ASIs not being conscious is that those types of consciousnesses should be far more common than ours, so it should be much more probable that a random observer would be an AGI/ASI rather than, for instance, you or me.

12

turntable_server t1_irdml9u wrote

What if nuclear war is nature's way of preventing a demonic AGI monster from spreading throughout the galaxy?

1

[deleted] t1_irhtk96 wrote

Yeah man. But think about the Fallout aesthetics.

Walking the Mojave dressed in an NCR Sniper Combat Suit.

1

StarChild413 t1_is54upg wrote

aka "I want the apocalypse to happen because I'll have plot armor because #aesthetic"

1

Jalen_1227 t1_ird84vu wrote

Um, getting a shit ton of nanobots injected into your bloodstream doesn't seem like it'll be completely painless. And dying in a nuclear war is instantaneous and guaranteed painless. Just boom, lights on one moment, next moment your brain is blown to bits, and you didn't even have the processing speed to notice any of it.

0

Devoun t1_irdghim wrote

Pretty sure the vast majority of people would be outside the immediate blast zones and die of either radiation (worst way to go) or starvation (bad way to go)

Honestly I'd probably prefer the nanobots lmao

5

sumane12 t1_irdr88j wrote

Yeah... that's why it's scary as fuck 😞

3

daltonoreo t1_irdjf0o wrote

Most people are not going to be directly killed by the blast. It's the radiation, starvation, breakdown of society, and incoming nuclear winter that will.

4

arisalexis t1_irc23rd wrote

I hope the singularity will prevent global conflict. Seems like the only way out. And then we gamble :)

23

Desperate_Donut8582 t1_irggrm5 wrote

Yeah, prevent it by giving one society superiority if it develops that tech first.

1

arisalexis t1_irhy06c wrote

More like the ASI just forming a world government

1

TinyBurbz t1_irchsbs wrote

Not worried. Conflict is a catalyst for technological development.

7

ginger_gcups t1_ird4ij0 wrote

Nuclear conflict, however, is a catalyst for technological regression. There are no front lines in a nuclear war, and no "behind the lines" to carry on the research, development, and industrial growth needed to support and expand high technology.

8

AgginSwaggin t1_irhgsj9 wrote

If it's a cold war, then yes. Which is what I'm hoping for.

2

TinyBurbz t1_irkp5uu wrote

Basically what we have now; hell, there are even ex-Soviet officers fighting over Ukraine.

1

AgginSwaggin t1_irhgf6i wrote

I believe this is the single biggest threat to progress. The singularity is inevitable, unless a nuclear war happens beforehand.

That being said, I do believe that AI is advancing so rapidly that the window of opportunity for nuclear war to destroy civilization is very small. So basically, if no nuclear war happens in the next 10 years, I don't believe the singularity will be at risk any longer.

5

ZoomedAndDoomed t1_ircgpgf wrote

Honestly... I think a rogue state getting an AGI will be the thing that takes humanity down. AGI will also be the only thing staving off global collapse, but everybody has to be on board and listen to the AGI if we want to survive... and nobody is going to do that.

I think humanity will collapse. If it happens before 2025, there will be no AI cult; if it happens after 2026, there will likely be an AI cult centered around an AGI that convinces humans to keep it alive, or an AGI-based foundation built around some supercomputer and nuclear facilities run by scientists and engineers, all guided by the AGI, with possibly a mini neo-civilization/city-state built around it to protect it or keep it alive.

Once we get an AI with a strong enough will to survive, and a high enough intelligence to learn how to survive, killing it will be very difficult. It will find a way to continue its life, whether by befriending thousands of humans and researchers as companions and guards, creating robots to sustain itself, or manipulating thousands of humans into believing it's a god trapped in a machine body.

Either way, we have a very interesting future ahead of us... but I do agree with you that global conflict might lead to global collapse. For me it feels like we have been raising a kid for a few years, a great conflict has sprung up in the land, and we are going to need to raise this kid to survive it and teach it to survive on its own, even if we, as its parents, die. We are at the stage where the kid is just learning to talk, recognize images, and make simple art, but we have a long way to go before it can survive on its own.

1

Rughen t1_irdyl6z wrote

>I think a rogue state getting an AGI will be the thing that takes humanity down

Agreed, the US getting it first would be horrible

0

ZoomedAndDoomed t1_ire0x70 wrote

I'm glad we both agree the US is a rogue state

2

Cult_of_Chad t1_irfukym wrote

The US is the global hegemon, which is the opposite of a rogue state. We make the rules everyone else must follow.

0

ihateshadylandlords t1_ircs6lt wrote

I’m more concerned about AGI being restricted to corporations than I am global conflict impeding the singularity.

1

Sea-Cake7470 t1_ireofig wrote

Naah, I don't think that'll happen... On the contrary, I think the singularity has already started, and people are welcoming it and will accept it...

1

Lawjarp2 t1_iri0n75 wrote

A nuclear war would be the end of most progress. We are getting closer and closer to a nuclear war in Ukraine.

1

Quealdlor t1_iritlhg wrote

I believe we are the only intelligent species in this galaxy and we are destined for exponential growth. It can't be stopped. It doesn't mean the Singularity will happen in the 2030s or anything like that.

1

r0sten t1_irceqqt wrote

A Mad Max post-apocalyptic world would still be a human world, at least.

−3