How to Choose an Activation Function for Neural Networks

Mısra Turp
3.8K views · 2 years ago
Vanishing and exploding gradients are two of the main problems we face when building neural networks. Before jumping into fixes, it is important to understand what they mean, why they happen, and what problems they cause for our networks. In this video, we will learn what it means for gradients to vanish or explode, and we will take a quick look at the techniques available for dealing with them.

Previous lesson: • How to Choose the Correct Initializer...
Next lesson: • How Does Batch Normalization Work

📙 Here is a lesson notes booklet that summarizes everything you learn in this course in diagrams and visualizations. You can get it here 👉 misraturp.gumroad.com/l/fdl
👩‍💻 You can get access to all the code I develop in this course here: github.com/misraturp/Deep-learning-course-repo
❓ To get the most out of the course, don't forget to answer the end-of-module questions: fishy-dessert-4fc.notion.site/Deep-Learning-101-Qu…
👉 You can find the answers here: fishy-dessert-4fc.notion.site/Deep-Learning-101-An…

RESOURCES:
🏃‍♀️ Data Science Kick-starter mini-course: www.misraturp.com/courses/data-science-kick-starte…
🐼 Pandas cheat sheet: misraturp.gumroad.com/l/pandascs
📥 Streamlit template (updated in 2023, now for $5): misraturp.gumroad.com/l/stemp
📝 NNs hyperparameters cheat sheet: www.misraturp.com/nn-hyperparameters-cheat-sheet
📙 Fundamentals of Deep Learning in 25 pages: misraturp.gumroad.com/l/fdl

COURSES:
👩‍💻 Hands-on Data Science: Complete your first portfolio project: www.misraturp.com/hods

🌎 Website: misraturp.com/
🐥 Twitter: twitter.com/misraturp
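To make the vanishing-gradient idea concrete, here is a minimal NumPy sketch (not from the course repo; layer count, width, and weight scale are illustrative assumptions). It backpropagates a gradient through a deep stack of sigmoid layers: because the sigmoid's derivative is at most 0.25, each layer can shrink the gradient, and after many layers the gradient reaching the bottom is vanishingly small.

```python
import numpy as np

# Minimal sketch, assuming an illustrative 20-layer, width-32 sigmoid network
# with small random weights; numbers are chosen to make the shrinkage visible.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers, width = 20, 32
weights = [rng.normal(0.0, 0.1, (width, width)) for _ in range(n_layers)]

# Forward pass: keep the pre-activations, since sigmoid'(z) is needed later.
x = rng.normal(size=width)
activations, pre_acts = [x], []
for W in weights:
    z = W @ activations[-1]
    pre_acts.append(z)
    activations.append(sigmoid(z))

# Backward pass: start with a unit gradient at the output and apply the
# chain rule layer by layer; sigmoid'(z) = s(1 - s) <= 0.25 at every step.
grad = np.ones(width)
norms = []
for W, z in zip(reversed(weights), reversed(pre_acts)):
    s = sigmoid(z)
    grad = W.T @ (grad * s * (1.0 - s))
    norms.append(np.linalg.norm(grad))

print(f"gradient norm at top layer:    {norms[0]:.3e}")
print(f"gradient norm at bottom layer: {norms[-1]:.3e}")
```

Swapping in a larger weight scale (say 2.0) flips the same experiment into the exploding case, where the norm grows layer by layer instead, which is why the fixes the video surveys (careful initialization, batch normalization) target the per-layer scaling.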
Published 2 years ago, on 1401/08/18 (Solar Hijri calendar).