ThisIsMyStonerAcount t1_iz96wlt wrote
I think you mean just "ML", so I'll leave out symbolic approaches. I'll also mostly focus on Deep Learning as the currently strongest trend. Even then, 20 papers wouldn't be enough to summarize a trajectory, but they could give a rough overview of the field.
Papers might not be the right medium for this, so I'll also use other kinds of publications. Off the top of my head, it would be the publications that introduced the following (I'm too lazy to look up the exact citations), in roughly temporal order from oldest to newest:
- Bayes Rule
- Maximum Likelihood Estimation (this is a whole field, not a single paper, not sure where it got started)
- Expectation Maximization
- Perceptron
- Minsky & Papert's "XOR is unsolvable" result (i.e., the end of the first "Neural Network" era; toy sketch after the list)
- Neocognitron
- Backprop
- TD-Gammon
- Vanishing Gradients (i.e., the end of the 2nd NN era)
- LSTMs
- SVM
- RBMs (i.e., the start of Deep Learning and the 3rd NN era)
- ImageNet
- Playing Atari with Deep Reinforcement Learning
- Attention is All You Need (core operation sketched below)
- AlphaGo
- GPT-3 (arguably this could be replaced by BERT, GPT-1 or GPT-2)
- CLIP
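To make the two oldest neural-net entries concrete, here's a minimal toy sketch of the perceptron learning rule and the XOR limitation Minsky & Papert pointed out. This is standard textbook material in my own code, not taken from any of the papers above; names and hyperparameters are arbitrary:

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Classic single-layer perceptron with a bias term; labels in {0, 1}."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a constant bias feature
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            pred = int(xi @ w > 0)             # step activation
            w += lr * (yi - pred) * xi         # Rosenblatt's update rule
    return w

def accuracy(w, X, y):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.mean((Xb @ w > 0).astype(int) == y)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_and = np.array([0, 0, 0, 1])  # linearly separable -> learnable
y_xor = np.array([0, 1, 1, 0])  # not linearly separable -> hopeless

print(accuracy(train_perceptron(X, y_and), X, y_and))  # 1.0
print(accuracy(train_perceptron(X, y_xor), X, y_xor))  # never reaches 1.0
```

AND is linearly separable, so the perceptron convergence theorem guarantees the first call reaches perfect accuracy; XOR isn't, so no single-layer weight vector can classify more than 3 of the 4 points correctly.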
This is of course very biased toward the last 10 years (because those are the ones I lived through).
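And since the attention paper dominates that recent stretch, here's its core operation, scaled dot-product attention, as a toy numpy sketch (single head, no masking or learned projections; the shapes and names are mine, not the paper's code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (n, d_k); V: (n, d_v). Single head, no masking."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])         # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # attention-weighted values

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```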
acardosoj t1_iz97oqc wrote
You are right, MLE is the basis of everything, and it's largely the work of Ronald Fisher, one of the greatest statisticians of all time!
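For anyone who hasn't seen it written down, the recipe Fisher formalized is (standard textbook statement, nothing specific to this thread): given i.i.d. data x_1, ..., x_n and a model p(x | θ), pick the parameters that maximize the log-likelihood:

```latex
\hat{\theta}_{\text{MLE}} = \arg\max_{\theta} \sum_{i=1}^{n} \log p(x_i \mid \theta)
```

For example, under a Gaussian noise model with fixed variance this objective reduces to ordinary least squares, which is one reason MLE sits underneath so many classical methods.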
KingRandomGuy t1_iza65df wrote
This lines up with ImageNet, but I'd probably drop in AlexNet as well.
ThisIsMyStonerAcount t1_izakcfe wrote
That's actually what I meant, thanks for pointing it out! Edited