Submitted by froggygun t3_126otke in singularity
[removed]
>we are just getting more and more efficient by the day meaning anyone will be able to run GPT-n perf on their hardware soon.
yes, anyone will be able to train neural networks but not the kind to make simps like musk tremble with fear. open ai spent 7 million on cloud computing costs alone to train gpt. it would be a trivial (and misguided) task to shut down future ai development.
Actually no, https://twitter.com/summerlinARK/status/1599196885675544576?lang=de. And this is still an underestimate, because predicting 10 years of algorithmic advances in the field of AI is silly. And that doesn't even account for distillation, more publicly available datasets and models, multi-LLM systems,... There are so many dimensions in which this train is running, it makes you dizzy thinking about it and makes regulation look like nothing more than pure cope.
it's a suggested pause of 6 months, not 7 years. plus, i agree the pause is a dumb idea.
Elon wouldn't focus on any problems other than threats to his bank account. Grow up. These people are not special.
I don't think it's going to happen. Everyone is too concerned about the competition, and people who are actively developing this are convinced it's safe enough.
Don't worry about Musk. He's just a sad, pathetic little man with too much money.
Elon Musk is in favor of a six month pause, not killing all AI. You're thinking of Yudkowsky. Don't worry, it's doubtful that anything is going to come of this. Nobody is going to take away AI or even slow it down.
> Elon Musk is in favor of a six month pause
so that his team can continue working on it and get ahead
Oh okay, that's a little better at least. I thought he wanted to shut it down completely and ban it. I guess I misread something since I was stressed.
They want a 6 month pause on training these large language models. It's utopian thinking, not consistent with capitalism.
Ever hear about what happens when a new batch of powerful heroin would hit the streets, killing some junkies via overdose? The other junkies go looking for that shit, because something bad isn't going to happen to *them*, right? Tragedy is for the other poor bastards.
That's what's at work here. Capitalism doesn't allow them to slow down with AI development, no matter what the risk is. In fact, for VCs and C-suite tech company execs (basically the same tribe), risk is exactly what they want. Risk equals reward.
They don't believe that the risk is existential for the human race. They can't believe that. If they admit this possibility, they open the door to introducing ethics and morality into their business decisions, which in this case they cannot do, since they fear their competitors will not be similarly bound.
There's no slowing down. Nobody is pausing anything, regardless of how good an idea it might be.
This isn't even taking into account the military and intelligence services, who are almost certainly investing mega millions into LLM development. You can bet that the NSA is balls-deep in this field.
All this letter does is pour more chum into the water.
There are also far too many national security and, unfortunately, global political consequences to stopping or pausing. Politicians and other heads of state know that the stakes are too high.
Let's say the US and EU agree to pause AI research for a year. That just means we give China and Russia a one-year head start. The genie is out of the bottle and we have to deal with it. I'm more confident if western democracies develop this technology than if some dictatorship does it. I do however agree that we are playing with fire.
Sure_Cicada_4459 t1_jea3juc wrote
Good news, it's literally impossible. Even the assumption that it's feasible to track GPU accumulation and therefore crack down on training runs above a certain size is very brittle. The incentive for obfuscation aside, we are just getting more and more efficient by the day, meaning anyone will be able to run GPT-n perf on their hardware soon. Even many signatories acknowledge how futile it is, but just want to signal that something needs to be done for whatever reasons (fill in your blanks).
Bad news, there is a non-trivial risk of this dynamic blowing up in our faces; I just don't think restrictions are the way to go.