Understanding Gated Recurrent Unit (GRU) Deep Neural Network
You've seen how a basic RNN works. In this video, you'll learn about the Gated Recurrent Unit (GRU), which is a modification to the RNN hidden layer that makes it much better at capturing long-range connections and helps a lot with the vanishing gradient problem. Let's take a look. You've already seen the formula for computing the activations at time t
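The GRU hidden layer described above can be sketched as a single NumPy step. This is a minimal illustration, not the lecture's own code: the weight and gate names (`Wu`, `Wr`, `Wc`, update gate, relevance gate) are assumptions based on the standard GRU formulation, and the memory cell `c` plays the role of the hidden activation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_cell(c_prev, x, Wu, bu, Wr, br, Wc, bc):
    """One GRU time step (illustrative parameter names, not from the lecture).

    c_prev: previous memory cell, shape (h,)
    x:      current input,        shape (n,)
    Each weight matrix has shape (h, h + n); each bias has shape (h,).
    """
    concat = np.concatenate([c_prev, x])
    gamma_u = sigmoid(Wu @ concat + bu)      # update gate: how much to overwrite memory
    gamma_r = sigmoid(Wr @ concat + br)      # relevance (reset) gate
    concat_r = np.concatenate([gamma_r * c_prev, x])
    c_tilde = np.tanh(Wc @ concat_r + bc)    # candidate replacement for the memory cell
    # Gated blend: when gamma_u is near 0, the old memory is carried forward almost
    # unchanged, which is what helps with long-range connections and vanishing gradients.
    return gamma_u * c_tilde + (1.0 - gamma_u) * c_prev
```

Because the output is a convex combination of the old cell and a tanh-bounded candidate, the cell can pass information across many time steps nearly untouched whenever the update gate stays close to zero.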
Published on 1398/03/06.