Backpropagation in RNN | Backpropagation through time

Coding Lane
In this video, we will understand Backpropagation in RNN. It is also called Backpropagation Through Time, since the error gradients are propagated backward through the time steps of the sequence.

Understanding Backpropagation in RNN helps us understand how Recurrent Neural Networks work. It is also important for understanding the vanishing gradient problem that occurs in RNNs.

We will look at the general equation for computing Backpropagation in RNN, and we will also see how to find gradients with respect to weights in Recurrent Neural Networks.
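
For reference, below is a minimal NumPy sketch of the forward pass and of Backpropagation Through Time for a vanilla RNN. It assumes the common notation a<t> for the hidden state, x<t> for the input, weights Wax, Waa, Wya, and a softmax output with cross-entropy loss; the exact symbols and loss used in the video may differ.

import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def bptt(xs, ys, Wax, Waa, Wya, ba, by):
    # xs, ys: lists of column vectors (inputs and one-hot targets), one per time step.
    T = len(xs)
    a = {-1: np.zeros((Waa.shape[0], 1))}
    y_hat = {}
    # Forward propagation: a<t> = tanh(Waa a<t-1> + Wax x<t> + ba), y_hat<t> = softmax(Wya a<t> + by)
    for t in range(T):
        a[t] = np.tanh(Waa @ a[t - 1] + Wax @ xs[t] + ba)
        y_hat[t] = softmax(Wya @ a[t] + by)
    dWax, dWaa, dWya = np.zeros_like(Wax), np.zeros_like(Waa), np.zeros_like(Wya)
    dba, dby = np.zeros_like(ba), np.zeros_like(by)
    da_next = np.zeros_like(a[-1])           # gradient arriving from step t+1
    # Backpropagation through time: walk the time steps in reverse and sum the gradients.
    for t in reversed(range(T)):
        dz_y = y_hat[t] - ys[t]              # dL<t>/d(Wya a<t> + by) for softmax + cross-entropy
        dWya += dz_y @ a[t].T                # gradient w.r.t. Wya
        dby += dz_y
        da = Wya.T @ dz_y + da_next          # total gradient reaching a<t>
        dz_a = (1 - a[t] ** 2) * da          # backprop through tanh
        dWaa += dz_a @ a[t - 1].T            # gradient w.r.t. Waa
        dWax += dz_a @ xs[t].T               # gradient w.r.t. Wax
        dba += dz_a
        da_next = Waa.T @ dz_a               # pass gradient back to a<t-1>
    return dWax, dWaa, dWya, dba, dby

The repeated multiplication by Waa.T (together with the tanh derivative) in the backward loop is the term that can shrink over long sequences, which is where the vanishing gradient problem mentioned above comes from.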

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

Timestamps:
0:00 Intro
0:30 Forward Propagation
3:24 Gradient w.r.t. Wya
5:08 Gradient w.r.t. Waa
7:27 Gradient w.r.t. Wax
8:23 End

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

Follow my entire playlist on Recurrent Neural Networks (RNN):

📕 RNN Playlist: What is Recurrent Neural Network in D...

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

✔ CNN Playlist: What is CNN in deep learning? Convolu...

✔ Complete Neural Network: How Neural Networks work in Machine L...

✔ Complete Logistic Regression Playlist: Logistic Regression Machine Learning ...

✔ Complete Linear Regression Playlist: What is Linear Regression in Machine ...

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

If you want to ride on the Lane of Machine Learning, then Subscribe ▶ to my channel here: @codinglane