Submitted by fortunum t3_zty0go in singularity
Ortus12 t1_j1gg2ws wrote
Reply to comment by fortunum in Hype bubble by fortunum
The last AI winter was caused by insufficient compute. We now have sufficient compute, and we've found that no new algorithmic advances are necessary: scale up compute for existing algorithms and intelligence scales along with it.
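For context, the "scale compute and capability follows" claim comes from the empirical scaling-law results, where loss falls roughly as a power law in training compute. A minimal sketch of that shape, with made-up coefficients (a, b, c are placeholders, not measured values from any paper):

```python
# Toy illustration of the scaling-law claim: loss falling as a power law in
# training compute. The coefficients a, b, c are made-up placeholders.
def toy_loss(compute_flops: float, a: float = 2.5, b: float = 0.05, c: float = 1.7) -> float:
    """Hypothetical test loss as a power law in training compute."""
    return a * compute_flops ** -b + c  # decreases with compute, but never below the floor c

for flops in (1e18, 1e20, 1e22, 1e24):
    print(f"{flops:.0e} FLOPs -> loss ~ {toy_loss(flops):.3f}")
```

Note the diminishing returns: each 100x of compute buys a smaller drop in loss, which is exactly the point the skeptics below push back on.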
There are no longer any barriers to scaling compute, because internet speeds are high enough that all of the compute can live in server farms that are continually expanded. Energy costs are coming down toward zero, so that's not a limiting factor either.
The feedback loop now is: AI makes money, the money buys more compute, the AI gets smarter and makes more money.
The expert systems of the 80s and 90s grew too complex for dumb humans to manage. That's no longer a bottleneck because, again, all you have to do is scale compute. Smart programmers can accelerate that by optimizing and designing better data-curation systems, but even that isn't necessary. It's now a manual labor job that almost anyone can be hired to do (plugging in more computers).
GuyWithLag t1_j1hgj0h wrote
Dude, no. Listen to the PhDs - the rapture isn't near, not yet at least.
On a more serious note: this is what the OP refers to when talking about a "hype bubble". The professionals working in the field know that the current crop of AI models is definitely not suitable as the architecture of AGI, except maybe as components of one. Overtraining is a thing, and it's been shown that overscaling is too. Dataset size is king, and the folks who create the headline-grabbing models have already fed the public internet into their datasets.
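A back-of-the-envelope sketch of why "dataset size is king" bites: under the roughly 20-tokens-per-parameter rule of thumb often quoted from the Chinchilla scaling results, token requirements grow in lockstep with model size, while high-quality public text is finite. The numbers here are approximations, not official figures:

```python
# Rough sketch: compute-optimal training tokens under an assumed
# ~20 tokens-per-parameter rule of thumb (Chinchilla-style estimate).
def compute_optimal_tokens(n_params: float, tokens_per_param: float = 20.0) -> float:
    """Approximate training tokens for compute-optimal training."""
    return n_params * tokens_per_param

for params in (70e9, 500e9, 1e12):
    print(f"{params / 1e9:.0f}B params -> ~{compute_optimal_tokens(params) / 1e12:.1f}T tokens")
```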
From a marketing standpoint, there's also the second-mover advantage: see what others did, fix the issues, and choose a different promotion vector. You're seeing many AI announcements in a short span because of the bandwagon effect, caused by a small number of teams showing multiple years' worth of work.
lil_intern t1_j1hnp2k wrote
If by rapture you mean evil robots taking people out of their houses, then yes, but what about millions of people's careers becoming obsolete overnight, every other month, as AI grows into unexpected fields? That seems pretty close.
Ortus12 t1_j1hzcoy wrote
The currently popular AI models are just whatever works best on current hardware.
We've already designed tons of different models, outlined in many older AI books, that can be put to use as compute scales (as AI companies make more money to spend on more compute). Even the current models weren't invented recently; they're just now applicable because the hardware is there.
There have been a few algorithmic optimizations along the way, but the larger portion of the scaling has come from hardware.
Second-mover companies are taking out first movers by improving on what they built, but that still keeps the ball moving forward.
ThePokemon_BandaiD t1_j1ipluc wrote
First of all, the current big datasets aren't the full internet, just large subsections of it: specific datasets of pictures or plain text. We also generate roughly 100 zettabytes of new data per year as of this year, and generative models can produce their own datasets, with humans sorting them for value for now. And while currently available LLMs and image recognition and generation models are still quite narrow, models like Gato and Flamingo have shown that multimodal models are at least possible with current tech, and IMO it's pretty clear that multiple narrow AI models could be combined into a program that acts as an AGI agent.
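A conceptual sketch of that last idea: several narrow models wired behind a single agent-style interface. The specialist functions and the keyword router below are hypothetical placeholders, not real model APIs; a real system would need a learned routing policy.

```python
# Conceptual sketch only: routing tasks to hypothetical narrow "specialist" models.
from typing import Callable, Dict

def caption_image(task: str) -> str:
    return f"[vision model] caption for: {task}"

def answer_text(task: str) -> str:
    return f"[language model] answer to: {task}"

def transcribe_audio(task: str) -> str:
    return f"[speech model] transcript of: {task}"

SPECIALISTS: Dict[str, Callable[[str], str]] = {
    "image": caption_image,
    "audio": transcribe_audio,
    "text": answer_text,
}

def route(task: str) -> str:
    """Crude keyword router standing in for a learned routing policy."""
    for modality, handler in SPECIALISTS.items():
        if modality in task.lower():
            return handler(task)
    return answer_text(task)  # default to the language model

print(route("describe this image of a cat"))
print(route("answer a question about this text"))
```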
YesramDeens t1_j1jzcgo wrote
> Listen to the PhDs - the rapture isn't near, not yet at least.
Stop with this misinformation; for every three PhDs saying we will have an AI winter, there are six AI researchers at companies like OpenAI and DeepMind who are extremely excited about the potential of the systems they are creating.
Your unnecessary doomerism is born of a sense of superiority and intellectual arrogance. Don't be surprised if you're humbled later on.