Beginner Intro to Neural Networks 6: Slope of the Cost Function

giant_neural_network
In this video things get a bit more interactive!

We use three different techniques to find the slope of our cost function.

The first is a numerical approximation (not the best by any means, but it works for quick illustrative purposes). We divide a change in our function's output by the change in its input, and the smaller the change in input (h), the better the approximation.
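
Here's a minimal Python sketch of that finite-difference idea, assuming the cost function from the video is c(b) = (b - 4)^2 (the function whose derivative works out to 2*(b - 4) in the note below):

# Numerical slope: divide the change in output by the change in input h.
def cost(b):
    return (b - 4) ** 2  # assumed cost from the video

def approx_slope(b, h):
    return (cost(b + h) - cost(b)) / h

# The true slope at b = 1 is 2 * (1 - 4) = -6.
for h in [1.0, 0.1, 0.001]:
    print(h, approx_slope(1.0, h))  # estimate approaches -6 as h shrinks

Shrinking h pushes the estimate toward the true slope, which is exactly why a smaller change in input gives a better approximation.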

The second way we find the slope is by expanding the definition of the slope using our actual cost function. After simplifying and dropping the h term as h goes to zero, we end up with a simple function that looks very similar to the original cost function. This is the true derivative of our cost function.
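
If you want to check that expansion yourself, here's a quick sketch using sympy (a symbolic math library; my choice here, not something used in the video), again assuming c(b) = (b - 4)^2:

import sympy as sp

b, h = sp.symbols('b h')
cost = (b - 4) ** 2  # assumed cost from the video

# The definition of the slope: (c(b + h) - c(b)) / h
slope = sp.expand((cost.subs(b, b + h) - cost) / h)
print(slope)                  # 2*b + h - 8: only a lone h is left over...
print(sp.limit(slope, h, 0))  # ...and it drops out, leaving 2*b - 8 = 2*(b - 4)

The leftover h term is exactly the part we drop as h goes to zero.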

Finally we see a quick trick from calculus, the power rule, that helps us find the derivative: bring the exponent out front as a multiplier, then decrement the original exponent by 1.
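
Concretely, the power rule turns b^2 into 2*b and b^3 into 3*b^2. A quick sympy check (same sketch setup as above):

import sympy as sp

b = sp.symbols('b')
print(sp.diff(b ** 3, b))        # 3*b**2: exponent out front, exponent minus 1
print(sp.diff((b - 4) ** 2, b))  # 2*b - 8, i.e. 2*(b - 4), matching our cost's derivative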

NOTE! I left out a second step to keep the video brief, but you have to remember to differentiate the stuff inside the parentheses with respect to b as well. -4 is a constant so it becomes 0, and the derivative of b with respect to b (think of the graph of b as b changes; what's its slope? 1!) is 1. You multiply this inner derivative by your original power rule expression, so the total derivative becomes 2*(b-4)*1, which is the same as 2*(b-4). Thanks to Mohamed H. Guelleh in the comments for making me realize I shouldn't leave out these steps, regardless of whether the derivative comes out the same. We'll see this property more later when we have a neural network in place of b.
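
To see why that inner derivative matters, compare our cost with a made-up one where the inside isn't just b, say (2*b - 4)^2 (a hypothetical example, not from the video). The power rule alone gives 2*(2*b - 4), and multiplying by the inner derivative 2 gives 4*(2*b - 4) = 8*b - 16:

import sympy as sp

b = sp.symbols('b')
# Inner derivative of (b - 4) is 1, so it changes nothing:
print(sp.diff((b - 4) ** 2, b))    # 2*b - 8 == 2*(b - 4)*1
# Inner derivative of (2*b - 4) is 2, and it does matter:
print(sp.diff((2*b - 4) ** 2, b))  # 8*b - 16 == 2*(2*b - 4)*2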

Thanks for watching! I'm trying out some interactive stuff in the animation program that lets me evaluate simple expressions and slide numbers around. I hope it all makes sense enough!

Twitter: joncomo