Neural Networks Pt. 2: Backpropagation Main Ideas
526K views · 4 years ago
Backpropagation is the method we use to optimize parameters in a Neural Network. The ideas behind backpropagation are quite simple, but there are tons of details. This StatQuest focuses on explaining the main ideas in a way that is easy to understand.
NOTE: This StatQuest assumes that you already know the main ideas behind...
Neural Networks: • The Essential Main Ideas of Neural Ne...
The Chain Rule: • The Chain Rule
Gradient Descent: • Gradient Descent, Step-by-Step
LAST NOTE: When I was researching this 'Quest, I found this page by Sebastian Raschka to be helpful: sebastianraschka.com/faq/docs/backprop-arbitrary.h…
For a complete index of all the StatQuest videos, check out:
statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Buying my book, The StatQuest Illustrated Guide to Machine Learning:
PDF - statquest.gumroad.com/l/wvtmc
Paperback - www.amazon.com/dp/B09ZCKR4H6
Kindle eBook - www.amazon.com/dp/B09ZG79HXC
Patreon: www.patreon.com/statquest
...or...
YouTube Membership: www.youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw/join
...a cool StatQuest t-shirt or sweatshirt:
shop.spreadshirt.com/statques...
...buying one or two of my songs (or go large and get a whole album!)
joshuastarmer.bandcamp.com/
...or just donating to StatQuest!
www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
twitter.com/joshuastarmer
0:00 Awesome song and introduction
3:55 Fitting the Neural Network to the data
6:04 The Sum of the Squared Residuals
7:23 Testing different values for a parameter
8:38 Using the Chain Rule to calculate a derivative
13:28 Using Gradient Descent
16:05 Summary
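The pipeline in the chapters above (compute the Sum of the Squared Residuals, use the Chain Rule to get a derivative, then apply Gradient Descent) can be sketched in a few lines of code. The toy model and data below are illustrative assumptions, not the exact example from the video, which fits a parameter in a small neural network; here we fit a single parameter w in the simplest possible model, prediction = w * x.

```python
# Minimal sketch of the main ideas: SSR + Chain Rule + Gradient Descent.
# The model y = w * x and the data are hypothetical, chosen for illustration.

# Toy data (these lie exactly on y = 2x, so the optimal w is 2)
xs = [0.0, 0.5, 1.0]
ys = [0.0, 1.0, 2.0]

def ssr(w):
    # Sum of the Squared Residuals: SSR = sum((observed - predicted)^2)
    return sum((y - w * x) ** 2 for x, y in zip(xs, ys))

def d_ssr_dw(w):
    # Chain Rule: d(SSR)/dw = sum(2 * (observed - predicted) * -d(predicted)/dw)
    #                       = sum(-2 * x * (observed - predicted))
    return sum(-2 * x * (y - w * x) for x, y in zip(xs, ys))

# Gradient Descent: repeatedly step w against the derivative
# until the step size becomes tiny.
w = 0.0                  # initial guess
learning_rate = 0.1
for _ in range(1000):
    step = learning_rate * d_ssr_dw(w)
    w -= step
    if abs(step) < 1e-9:
        break

print(round(w, 4))
```

Running the sketch drives w to 2.0, the value that makes the SSR as small as possible for this data. Backpropagation in a real neural network is the same loop, just with the Chain Rule applied through many layered parameters instead of one.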
#StatQuest #NeuralNetworks #Backpropagation
Published on 1399/07/27 · 526,073 views