Submitted by itsstylepoint in r/MachineLearning
Hi folks,
stylepoint here.
I have released a YouTube series discussing and implementing activation functions.
Videos:
- Discussing and Implementing Sigmoid and Its Derivative Using PyTorch
- Discussing and Implementing ReLU and Its Derivative Using PyTorch
- Discussing and Implementing Leaky ReLU and Its Derivative Using PyTorch
- Discussing and Implementing GELU and Its Derivative Using PyTorch
- Discussing and Implementing Swish and Its Derivative Using PyTorch
- Discussing and Implementing SERF and Its Derivative Using PyTorch (r/MachineLearning special)
- Discussing and Implementing Tanh and Its Derivative Using PyTorch
GitHub: https://github.com/oniani/ai
Some notes about the series:
- In every video, I discuss the activation function before implementing it.
- In every video, I compute/derive the derivative/gradient of the activation function.
- In every video, I provide two implementations of the activation function: a manual one and one using PyTorch's autograd engine.
- In every video, I use gradcheck to verify the implementation (see the sketch after this list for the general pattern).
- Every video has timestamps, so you can skip parts that are not of interest.
- There is not a lot of interdependence across the videos, so you can watch some and skip others.
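For readers who want a quick idea of what the videos do before clicking through, here is a minimal sketch of the pattern described above, using sigmoid as the example. This is not the author's code from the repo, just an illustration under the stated assumptions: a manual activation implemented as a torch.autograd.Function with a hand-derived backward pass, checked against a numerical gradient via torch.autograd.gradcheck.

```python
import torch
from torch.autograd import gradcheck


class ManualSigmoid(torch.autograd.Function):
    """Sigmoid with a hand-written backward pass: sigma'(x) = sigma(x) * (1 - sigma(x))."""

    @staticmethod
    def forward(ctx, x):
        out = 1 / (1 + torch.exp(-x))
        ctx.save_for_backward(out)  # keep the forward output for reuse in backward
        return out

    @staticmethod
    def backward(ctx, grad_output):
        (out,) = ctx.saved_tensors
        # Chain rule: dL/dx = dL/dout * sigma(x) * (1 - sigma(x))
        return grad_output * out * (1 - out)


if __name__ == "__main__":
    # gradcheck compares the analytical gradient against a finite-difference estimate;
    # it needs double precision and requires_grad=True on the inputs.
    x = torch.randn(8, dtype=torch.double, requires_grad=True)
    print(gradcheck(ManualSigmoid.apply, (x,)))  # prints True if the gradients match
```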
Hope y'all will enjoy these vids!
Gemabo wrote:
Bookmarked!