Relu Leaky Relu and Swish Activation Functions || Lesson 8 || Deep Learning || Learning Monkey ||

Learning Monkey
#deeplearning #neuralnetwork #learningmonkey

In this class, we discuss the ReLU, Leaky ReLU, and Swish activation functions.
ReLU stands for rectified linear unit.
The equation is ReLU(z) = max(0, z), the maximum of the neuron's output z and zero.
If the output is negative we take zero; otherwise we pass the output through unchanged.
With ReLU we don't have the vanishing gradient problem, because the derivative is 1 for positive values.
But we get the problem of dead (unused) neurons.
For negative values the derivative is zero.
So those neurons stop getting updated.
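As a quick illustration (not shown in the video), here is a minimal NumPy sketch of ReLU and its gradient; the names relu and relu_grad and the sample inputs are just illustrative.

import numpy as np

def relu(z):
    # ReLU(z) = max(0, z): negative values become zero
    return np.maximum(0.0, z)

def relu_grad(z):
    # Derivative is 1 for positive z and 0 otherwise, so a neuron whose
    # output stays negative receives no gradient update (dead neuron)
    return (z > 0).astype(float)

z = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(z))       # [0.  0.  0.  1.5]
print(relu_grad(z))  # [0. 0. 0. 1.]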
To overcome this we use Leaky ReLU.
Instead of zero, negative outputs are scaled by a small slope, f(z) = αz with a small α (for example 0.01), so the value still changes with the output and the gradient stays non-zero.
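A matching Leaky ReLU sketch (again illustrative, assuming the common default slope alpha = 0.01):

import numpy as np

def leaky_relu(z, alpha=0.01):
    # For negative z, output alpha * z instead of zero,
    # so the gradient below zero is small but non-zero
    return np.where(z > 0, z, alpha * z)

def leaky_relu_grad(z, alpha=0.01):
    # Slope is 1 above zero and alpha at or below zero
    return np.where(z > 0, 1.0, alpha)

z = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(z))       # [-0.02  -0.005  0.     1.5  ]
print(leaky_relu_grad(z))  # [0.01 0.01 0.01 1.  ]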
Swish, defined as f(z) = z · sigmoid(βz), is a smooth compromise between a linear function and ReLU: with β = 0 it reduces to z/2, and as β grows it approaches ReLU.
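And a Swish sketch under the same assumptions (beta = 1 is the common default; the names are illustrative):

import numpy as np

def swish(z, beta=1.0):
    # Swish(z) = z * sigmoid(beta * z); smooth and slightly non-monotonic
    # beta = 0 gives z / 2, while a large beta approaches ReLU
    return z / (1.0 + np.exp(-beta * z))

z = np.array([-2.0, -0.5, 0.0, 1.5])
print(swish(z))  # approximately [-0.238 -0.189  0.     1.226]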



Link for playlists:
@learningmonkey


Link for our website: https://learningmonkey.in

Follow us on Facebook: learningmonkey

Follow us on Instagram: learningmonkey1

Follow us on Twitter: _learningmonkey

Mail us @ [email protected]