Dive Into Deep Learning, Lecture 2: PyTorch Automatic Differentiation (torch.autograd and backward)

Dr. Data Science
In this video, we discuss PyTorch’s automatic differentiation engine (torch.autograd), which powers neural network training via stochastic gradient descent. You will get a conceptual understanding of how autograd computes the gradient of multivariable functions. We start with derivatives, partial derivatives, and the definition of the gradient, and then show how to compute gradients in PyTorch using requires_grad=True and the backward() method. Along the way, we cover the classes and functions that implement automatic differentiation of arbitrary scalar-valued and non-scalar-valued functions, and we discuss the Jacobian matrix in PyTorch. Differentiation is a crucial step in nearly all machine learning and deep learning optimization algorithms; while the derivatives themselves are straightforward to compute, working out the parameter updates by hand quickly becomes tedious and error-prone.
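
As a minimal sketch of the workflow described above (the specific tensor values here are illustrative, not taken from the video): marking a tensor with requires_grad=True tells autograd to record operations on it, and calling backward() on a scalar result populates the .grad attribute with the gradient.

import torch

# Illustrative example: f(x) = sum(x_i^2); the analytical gradient is 2x.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
f = (x * x).sum()   # scalar-valued function of x
f.backward()        # autograd computes df/dx and stores it in x.grad
print(x.grad)       # tensor([2., 4., 6.]), i.e. 2 * x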
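For a non-scalar-valued function, backward() requires a vector argument and computes a vector-Jacobian product rather than the full Jacobian; the full matrix can be obtained with torch.autograd.functional.jacobian. Another small sketch, again with made-up inputs:

import torch
from torch.autograd.functional import jacobian

# Illustrative non-scalar output: y = x^2 elementwise.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * x
v = torch.ones_like(y)   # backward(v) computes v^T J, a vector-Jacobian product
y.backward(v)
print(x.grad)            # tensor([2., 4., 6.])

# Full Jacobian via the functional API: diagonal here, with entries 2 * x_i.
J = jacobian(lambda t: t * t, torch.tensor([1.0, 2.0, 3.0]))
print(J)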

#Autograd #PyTorch #DeepLearning
Published 1400/06/21 (Solar Hijri).