Submitted by Impressive-Injury-91 t3_1118hkt in singularity
genericrich t1_j8dbwqt wrote
The main drivers for AI progress recently have been:
- Availability of massive amounts of structured data that is easily accessed via the Internet.
- Massive GPU farms in cloud infrastructure, used for the statistical math these AI systems need.
Most of the algorithms were written or understood back in the 1960s, but everything was stored on paper then, and there were no GPUs for fast matrix math.
PrivateUser010 t1_j8dld1f wrote
We have to pay homage to algorithmic improvements too. Neural network models like Transformers, pretrained Transformers, and generative adversarial networks were all introduced in the 2010-2020 decade, and without those models the current changes would not be possible. So data, yes; processing power, yes; but models too.
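To make the "models too" point concrete, here is a minimal NumPy sketch of scaled dot-product attention, the core operation of the Transformer. This is an illustrative toy, not the original implementation; the shapes and random data are invented for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to every key; the output is a softmax-weighted
    mix of the value vectors (the Transformer's core operation)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # query-key similarity matrix
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                          # weighted sum of values

# toy example: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = K = V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one mixed vector per token
```

Note the operation itself is cheap; what 2010s hardware made feasible is running it over thousands of tokens, billions of parameters, and web-scale data.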
FusionRocketsPlease t1_j8ee988 wrote
I wonder why these algorithms didn't come out in the 60's.
Agarikas t1_j8et372 wrote
Because there was no need
FusionRocketsPlease t1_j8etpqu wrote
There's been a lot of bizarre, mental-masturbation-tier math created since the 19th century.
SoylentRox t1_j8e7bvb wrote
This is false. None of the algorithms we use now existed. They were not understood. Much simpler prior versions of the algorithms did exist. It's a chicken-and-egg problem: we needed immense amounts of compute to find the algorithms needed to take advantage of immense amounts of compute.
FusionRocketsPlease t1_j8eeegf wrote
What do you mean computation is needed to discover algorithms?
SoylentRox t1_j8efx92 wrote
Many algorithms don't show a benefit unless used at large scale. Maybe "discover" is the wrong word: if your ML researcher pool has 10,000 ideas but only 3 are good, you need a lot of compute to benchmark all the ideas and find the good ones. A LOT of compute.
Arguably you "knew" about the 3 good ideas years ago but couldn't distinguish them from the rest. So no, you really didn't know.
Also, Transformers are a recent discovery (2017); it took compute and software frameworks that support complex NN graphs to even develop the idea.
genericrich t1_j8ebipv wrote
Gradient descent was well understood by the early 20th century, for fluid dynamics I believe.
So, not false. :)
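The method being referred to is simple enough to fit in a few lines, which is part of the point: knowing it on paper is not the bottleneck. A minimal sketch (illustrative only, not the historical fluid-dynamics usage), minimizing f(x) = (x - 3)^2 by stepping against the derivative:

```python
def gradient_descent(f_prime, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient to find a local minimum."""
    x = x0
    for _ in range(steps):
        x -= lr * f_prime(x)   # move downhill by lr * slope
    return x

# f(x) = (x - 3)^2, so f'(x) = 2 * (x - 3); the minimum is at x = 3
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # 3.0
```

The hard part was never this loop; it was doing it over billions of parameters with automatic differentiation on hardware that could keep up.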
SoylentRox t1_j8ecpwg wrote
But yes, false? Your argument is like saying people in 1850 knew about aerodynamics and combustion engines.
Which, yes, some did. That doesn't negate the first powered flight 50 years later; it was still a significant accomplishment.
genericrich t1_j8edlrk wrote
<eyeroll> Nobody is saying there haven't been major changes in AI in the last few years. I certainly am not saying that.
But many of the underlying algorithms were well understood in other disciplines, and the industry knew they would have applications for AI; the data and infrastructure just weren't there in the 1960s or 1980s.
SoylentRox t1_j8eh4tu wrote
My point is that scale matters. A 3D multiplayer game was "known" to be possible in the 1950s. They had (mostly offline-rendered) computer graphics. They had computer networks. Nothing in the idea was impossible in principle, but in practice it was nearly completely out of reach. The only thing remotely similar cost more than the entire Manhattan Project, and they were playing that 3D game in real life: https://en.wikipedia.org/wiki/Semi-Automatic_Ground_Environment
If you had enthused about future game consoles in the 1950s, you'd have been blown off. Similarly, we have heard about the possibility of AI for about that long, and suddenly, boom, the dialogue of HAL 9000, for instance, is actually quite straightforward: we could duplicate EXACTLY the functions of that AI right now, no problem. Just take a transformer network, add some stream-control characters to send commands to ship systems, and add a summary of the ship's system status to the memory it sees each frame. Easy. (Note this would be dangerous and unreliable... just like in the movie.)
Also note that in the 1950s there was no guarantee the number of vacuum tubes you would need to support a 3D game (hundreds of millions) would EVER be cheap enough for ordinary consumers. The integrated circuit had not been invented.
Humans for decades thought an AGI might take centuries of programming effort.
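The "HAL via transformer" loop described above can be sketched in a few lines. Everything here is invented for illustration: `fake_llm` stands in for a real transformer inference call, and the `<CMD>` token format, the command names, and the status string are all hypothetical.

```python
issued_commands = []

def dispatch(cmd):
    """Hypothetical router from parsed commands to ship systems."""
    issued_commands.append(cmd)

def ship_status():
    # placeholder summary of ship systems, injected each "frame"
    return "O2 nominal; reactor nominal; pod bay doors closed"

def fake_llm(prompt):
    # stand-in for real transformer inference over the prompt
    return "<CMD>hold_pod_bay_doors</CMD> I'm sorry, Dave. I'm afraid I can't do that."

def step(crew_line):
    """One frame: model sees ship status + crew dialogue, may emit commands."""
    prompt = f"[STATUS] {ship_status()}\n[CREW] {crew_line}"
    reply = fake_llm(prompt)
    if "<CMD>" in reply:  # parse stream-control token into a ship command
        dispatch(reply.split("<CMD>")[1].split("</CMD>")[0])
    return reply.split("</CMD>")[-1].strip()

print(step("Open the pod bay doors, HAL."))
```

As the comment notes, wiring model output directly to ship systems is exactly the dangerous-and-unreliable part.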
Villad_rock t1_j8dxtcr wrote
Thought Google invented Transformers?
PrivateUser010 t1_j8e0iai wrote
Google invented Transformers. It was released in 2015, I think. It was the first model built entirely around the attention mechanism.
SoylentRox t1_j8e72bz wrote
2017...
Everything that mattered was the last few years. The previous stuff didn't work well enough.
savedposts456 t1_j8isobq wrote
Exactly. Attention is All You Need.