Comments


just-a-dreamer- t1_j5g91qq wrote

AI will move forward, no matter what happens in the world

When the USA dropped the first nuclear bombs in 1945, nuclear technology was fated to spread. Within 20 years all major powers had enough bombs to blow up the world and then some.

AI will kick off likewise. Too many companies and countries are pouring in resources for the train to be stopped.

18

whatsinyourhead t1_j5gfi5b wrote

I was watching that recent talk Sam Altman did, and he was saying they misjudged how big ChatGPT was going to be and that they are going to be slower in releasing things in the future. That disappointed me a bit, because it sounds like they are going to hold things back and just drip-feed them to us to soften the impact and get people more comfortable with the gradual change.

8

TheSecretAgenda t1_j5gp2r6 wrote

The economic implications of the destruction of capitalism have them spooked.

12

AsuhoChinami t1_j5hf792 wrote

More than just disappointing, it's outright disgusting. Change this shitty fucking world as fast as you can.

8

marcellux314 t1_j5g9dx0 wrote

unless there is a huge recession or world war III sets in I don't think improvement is likely to stop

7

PanzerKommander t1_j5gaxdq wrote

Recession won't stop it, ww3 might even speed it up.

10

sticky_symbols t1_j5gdat2 wrote

Ww3 will most certainly end civilization, if not the entire species. Experts are unsure whether a full scale nuclear exchange would kill every single human being, but certainly the vast majority.

10

PanzerKommander t1_j5gfe3g wrote

  1. You give nukes way more credit than they deserve.

  2. The nuclear weapons are pointing at each other; the goal is to knock out the other guy's capabilities without killing his civilian population (you need them alive to be hostages).

  3. You can have a large war between major powers without it going nuclear, as long as the goal of the war isn't to dismantle each other.

Source: I was an Air Force Minuteman Operator

6

sticky_symbols t1_j5goljf wrote

I am respectfully going with the opinions of those who have studied the effects of fallout and nuclear winter.

Yes, we could have a large war without a nuclear exchange. That does not seem likely.

6

YobaiYamete t1_j5iwmqw wrote

> you need them alive to be hostage

This is the big flaw in your argument. You are assuming two rational countries are having a friendly saber rattling thermonuclear exchange.

I can assure you, 100% assure you, that if Kim Jong Un was launching nukes at the US and thought he could get away with it, he would absolutely be targeting every large population center he could

2

wildechld t1_j5iit2h wrote

But what if humanity uploaded their consciousness and merged with AI? Perhaps that is our future and the path of evolution we were destined to take. You can destroy our physical form, but we as a species would still survive.

1

marcellux314 t1_j5gcmj7 wrote

I am afraid there would not be any Manhattan-like projects; the fear of the enemy being the first to create a powerful entity like AGI would impel them toward total annihilation of the other.

1

[deleted] OP t1_j5ggtjz wrote

The question is whether the current approach (which is soaking up most of the investment and attention right now) is the end game, or just a small piece.

Despite what this sub thinks, I don't know any experts who truly believe we have reached the end game. Even worse, no one truly knows what it will take to get to true AGI/ASI.

Personally, as someone with experience in software development and ML, I think AI usage will continue to increase. It already has been increasing, but not in the way most people think. And AGI is a pipe dream at this point; I will be pleasantly surprised if I see it within my lifetime.

6

TotalMegaCool t1_j5hpy08 wrote

If we make no more progress with AI beyond today and only implement what we already have, the world will be irreparably altered and our economies will be unrecognisable in twenty years' time.

4

TheSecretAgenda t1_j5gonad wrote

Other than nuclear war or a meteor strike, I don't think so. There certainly exists the possibility that some major roadblock will arise in the future, but I think it is unlikely.

1