Lec 4: Activation Functions - From step functions to leaky ReLU: the journey (Neural Networks)
453 views · 4 years ago
- What is the role of an activation function in a neural network?
- What are some of the issues with initial activation functions like the step function?
- How do sigmoid and tanh work, and what are the issues of gradient saturation and vanishing gradients?
- How are these issues solved by using ReLU? What is the dead neuron issue?
- How is the dead neuron issue resolved by using leaky ReLU?
- What is the use of a softmax function?
- A summary of the issues and applications of all of them
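The activation functions listed above can be sketched in a few lines of plain Python. This is a minimal illustration of the lecture's topics, not code from the video; the function names and the `alpha=0.01` leaky slope are common conventions assumed here.

```python
import math

def step(x):
    # Step: binary output, zero gradient everywhere, so it cannot
    # carry a learning signal through backpropagation
    return 1.0 if x >= 0 else 0.0

def sigmoid(x):
    # Sigmoid squashes input to (0, 1); for large |x| it saturates,
    # which causes the vanishing-gradient problem
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Tanh squashes input to (-1, 1); zero-centered, but it also saturates
    return math.tanh(x)

def relu(x):
    # ReLU: identity for x > 0, zero otherwise; the gradient is 0 for
    # x < 0, so a neuron stuck in the negative region stops updating
    # (the "dead neuron" issue)
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha for x < 0, so the gradient
    # never fully dies (assumed alpha, a typical default)
    return x if x > 0 else alpha * x

def softmax(xs):
    # Softmax turns a vector of scores into a probability distribution;
    # subtracting the max is the standard numerical-stability trick
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Gradient saturation: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)),
# which is nearly zero for large |x|
print(sigmoid(10.0) * (1 - sigmoid(10.0)))  # ≈ 4.5e-5, almost no gradient
print(relu(-3.0), leaky_relu(-3.0))         # ReLU kills the signal, leaky keeps a trace
print(softmax([1.0, 2.0, 3.0]))             # sums to 1, largest score gets highest probability
```

Comparing the last two print lines shows the dead-neuron fix directly: for a negative input, ReLU outputs exactly 0 (and so does its gradient), while leaky ReLU still passes a small, trainable signal.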
Published on 1399/03/20 (Solar Hijri calendar).