Implementing ReLU and Its Derivative from Scratch
860 views · 2 years ago
In this video, we discuss and implement the ReLU activation function and its derivative using PyTorch.
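For reference, ReLU and its derivative can be written in a few lines of PyTorch. This is a minimal sketch (the function names `relu` and `relu_grad` are illustrative, not from the video's codebase):

```python
import torch

def relu(x: torch.Tensor) -> torch.Tensor:
    # ReLU(x) = max(x, 0), applied elementwise
    return torch.maximum(x, torch.zeros_like(x))

def relu_grad(x: torch.Tensor) -> torch.Tensor:
    # Derivative: 1 where x > 0, else 0.
    # (Undefined at exactly 0; the convention here is 0.)
    return (x > 0).to(x.dtype)

print(relu(torch.tensor([-1.0, 0.0, 2.0])))       # tensor([0., 0., 2.])
print(relu_grad(torch.tensor([-1.0, 0.0, 2.0])))  # tensor([0., 0., 1.])
```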
Codebase: https://github.com/oniani/ai
GitHub: https://github.com/oniani
Web: https://oniani.ai
#ai #softwareengineering #programming #stylepoint #relu
Chapters
0:00 - Intro
0:33 - Discussing ReLU
6:40 - Computing the derivative of ReLU
8:21 - The API of the first approach
8:45 - Implementing `forward` method
9:14 - Implementing `backward` method
9:55 - Using `gradcheck` for testing
10:21 - The alternative implementation
13:05 - Outro
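The `forward`/`backward`/`gradcheck` steps in the chapters above can be sketched with a custom `torch.autograd.Function`. This is an assumed outline of the approach, not the video's exact code:

```python
import torch

class ReLU(torch.autograd.Function):
    """ReLU implemented from scratch via torch.autograd.Function."""

    @staticmethod
    def forward(ctx, x: torch.Tensor) -> torch.Tensor:
        # Save the input so backward can recompute the mask
        ctx.save_for_backward(x)
        return x.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output: torch.Tensor) -> torch.Tensor:
        (x,) = ctx.saved_tensors
        # d/dx ReLU(x) = 1 for x > 0, 0 otherwise
        return grad_output * (x > 0).to(grad_output.dtype)

# gradcheck compares the analytic backward against finite differences;
# double precision is required for the numerical comparison to be reliable
torch.manual_seed(0)
x = torch.randn(8, dtype=torch.double, requires_grad=True)
print(torch.autograd.gradcheck(ReLU.apply, (x,)))
```

A simpler alternative (likely what the "alternative implementation" chapter refers to, though that is an assumption) is to skip the custom `Function` entirely and let autograd differentiate `x.clamp(min=0)` directly.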
Published on 1401/07/05.