Tutorial 10 - Activation Functions: Rectified Linear Unit (ReLU) and Leaky ReLU, Part 2

Krish Naik
After going through this video, you will know:

1. What are the basic problems with the Sigmoid and threshold activation functions?
2. What is the ReLU activation function?
3. What is the Leaky ReLU activation function?
4. How does ReLU solve the vanishing gradient problem? (A short code sketch of these functions follows this list.)

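As a quick reference, here is a minimal NumPy sketch of the activation functions covered in the video. It is only an illustrative sketch: the 0.01 slope used for Leaky ReLU is an assumed common default, not a value taken from the video.

import numpy as np

def sigmoid(x):
    # Sigmoid squashes inputs into (0, 1); its gradient never exceeds 0.25,
    # which is why deep networks built on it can suffer from vanishing gradients.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # ReLU returns max(0, x); its gradient is 1 for positive inputs,
    # so the error signal is not shrunk as it is backpropagated.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small slope (alpha) for negative inputs so that
    # neurons never go completely "dead". alpha=0.01 is an assumed default.
    return np.where(x > 0, x, alpha * x)

x = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
print("sigmoid   :", sigmoid(x))
print("relu      :", relu(x))
print("leaky relu:", leaky_relu(x))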
Below are the various playlists created on ML, Data Science and Deep Learning. Please subscribe and support the channel. Happy learning!

Deep Learning Playlist: Tutorial 1- Introduction to Neural Ne...
Data Science Projects playlist: Generative Adversarial Networks using...

NLP playlist: Natural Language Processing|Tokenization

Statistics Playlist: Population vs Sample in Statistics

Feature Engineering playlist: Feature Engineering in Python- What a...

Computer Vision playlist: OpenCV Installation | OpenCV tutorial

Data Science Interview Question playlist: Complete Life Cycle of a Data Science...

You can buy my book on Finance with Machine Learning and Deep Learning from the URL below.

Amazon URL: https://www.amazon.in/Hands-Python-Fi...

🙏🙏🙏🙏🙏🙏🙏🙏
YOU JUST NEED TO DO
3 THINGS to support my channel
LIKE
SHARE
&
SUBSCRIBE
TO MY YOUTUBE CHANNEL
Published 5 years ago, on 1398/05/03.
144,426 views