A Review of 10 Most Popular Activation Functions in Neural Networks

Machine Learning Studio
In this video, I'll be discussing 10 different activation functions used in machine learning, providing visualizations of their graphs and explaining the behavior of their derivatives. The list of activation functions covered includes:

1. Linear
2. ReLU
3. Leaky ReLU
4. Sigmoid (also known as the logistic sigmoid function)
5. Tanh (also known as the hyperbolic tangent function)
6. Softplus
7. ELU
8. SELU
9. Swish
10. GELU

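For quick reference, here is a minimal NumPy sketch of all ten functions (a sketch only; the Leaky ReLU slope of 0.01, the ELU alpha of 1.0, the SELU constants, and the tanh-based GELU approximation are conventional defaults, not values taken from the video):

import numpy as np

def linear(x):
    return x

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):   # alpha: conventional default slope
    return np.where(x > 0, x, alpha * x)

def sigmoid(x):                  # logistic sigmoid
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):                     # hyperbolic tangent
    return np.tanh(x)

def softplus(x):                 # log(1 + e^x), computed stably
    return np.logaddexp(0.0, x)

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x):                     # standard self-normalizing constants
    alpha, scale = 1.6732632423543772, 1.0507009873554805
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def swish(x):                    # a.k.a. SiLU: x * sigmoid(x)
    return x * sigmoid(x)

def gelu(x):                     # common tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))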
By the end of the video, you'll know which of these functions have continuous derivatives and which do not, and which are monotonic versus non-monotonic. ReLU's derivative, for instance, jumps from 0 to 1 at the origin, while Swish and GELU are smooth everywhere but non-monotonic.
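As an illustration, here's a quick numerical monotonicity check (a sketch, assuming the NumPy definitions above): sample each function on a grid and test whether its finite differences ever change sign.

x = np.linspace(-6.0, 6.0, 2001)
for name, f in [("ReLU", relu), ("Sigmoid", sigmoid), ("Swish", swish), ("GELU", gelu)]:
    dy = np.diff(f(x))           # discrete slope between neighboring grid points
    monotonic = bool(np.all(dy >= 0) or np.all(dy <= 0))
    print(name, "monotonic" if monotonic else "non-monotonic")

ReLU and Sigmoid pass, while Swish and GELU report non-monotonic because each dips to a small negative minimum before rising.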

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Errata:
The output of the sigmoid function lies in the open interval (0, 1); it approaches 0 and 1 asymptotically but never reaches them.
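A one-line check (again assuming the sigmoid defined above) makes this concrete; the outputs stay strictly inside the interval:

print(sigmoid(np.array([-10.0, 0.0, 10.0])))   # ~[4.5e-05, 0.5, 0.99995]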