Submitted by [deleted] t3_10islii in singularity
[removed]
Unless there's a huge recession or World War III sets in, I don't think improvement is likely to stop.
A recession won't stop it; WW3 might even speed it up.
I'm afraid there would not be any Manhattan-like projects; the fear of the enemy being the first to create a powerful entity like AGI would impel them toward total annihilation of the other.
Nukes
WW3 will most certainly end civilization, if not the entire species. Experts are unsure whether a full-scale nuclear exchange would kill every single human being, but it would certainly kill the vast majority.
You give nukes way more credit than they deserve.
The nuclear weapons are pointed at each other; the goal is to knock out the other guy's capabilities without killing his civilian population (you need them alive to be hostages).
You can have a large war between major powers without it going nuclear, as long as the goal of the war isn't to dismantle each other.
Source: I was an Air Force Minuteman operator.
I was watching that recent talk Sam Altman did, and he was saying they misjudged how big ChatGPT was going to be and that they're going to be slower about releasing things in the future. That disappointed me a bit, because it sounds like they're going to hold things back and just drip-feed them to us to soften the impact and get people more comfortable with the gradual change.
The question is whether the current approach (which is soaking up most of the investment and attention right now) is the end game, or just a small piece.
Despite what this sub thinks, I don't know of any experts who truly believe we have reached the end game. Even worse, no one truly knows what it will take to get to true AGI/ASI.
Personally, as someone with experience in software development and ML, I think AI usage will continue to increase; it already has been. But not in the way most people think. And AGI is a pipe dream at this point; I will be pleasantly surprised if I see it within my lifetime.
I'm respectfully going with the opinions of those who have studied the effects of fallout and nuclear winter.
Yes, we could have a large war without a nuclear exchange. That does not seem likely.
Other than nuclear war or a meteor strike, I don't think so. There certainly exists the possibility that some major roadblock will arise in the future, but I think it is unlikely.
The economic implications of the destruction of capitalism got them spooked.
... economic implications (and their immediate social consequences) ...
More than just disappointing, it's outright disgusting. Change this shitty fucking world as fast as you can.
If we make no more progress with AI beyond today and only implement what we already have, the world will be irreparably altered and our economies will be unrecognisable in twenty years' time.
[deleted]
Interesting
But what if humanity uploaded its consciousness and merged with AI? Perhaps that is our future and the evolutionary path we were destined to take. You could destroy our physical form, but we as a species would still survive.
> you need them alive to be hostage
This is the big flaw in your argument. You are assuming two rational countries are having a friendly, saber-rattling thermonuclear exchange.
I can assure you, 100% assure you, that if Kim Jong Un were launching nukes at the US and thought he could get away with it, he would absolutely be targeting every large population center he could.
[removed]
just-a-dreamer- t1_j5g91qq wrote
AI will move forward, no matter what happens in the world
When the USA dropped the first nuclear bombs in 1945, nuclear technology was fated to spread. Within 20 years, all major powers had enough bombs to blow up the world and then some.
AI will kick off likewise. Too many companies and countries are pouring in resources for the train to be stopped.