Decision Trees Geometric Intuition | Entropy | Gini impurity | Information Gain

CampusX
Decision Trees use metrics like Entropy and Gini Impurity to make split decisions. Entropy measures the disorder or randomness in a dataset, while Gini Impurity quantifies the probability of misclassifying a randomly chosen element if it were labeled according to the node's class distribution. Information Gain, defined as the reduction in impurity after a split, guides the tree toward the most informative features, enabling effective decision-making in classification tasks.
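For reference, here is a minimal NumPy sketch (not from the video) of how these three quantities can be computed for a candidate split; the class counts are a made-up "Play Tennis"-style example:

import numpy as np

def entropy(labels):
    # Shannon entropy: -sum(p * log2(p)) over the class proportions.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(labels):
    # Gini impurity: 1 - sum(p^2) over the class proportions.
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def information_gain(parent, left, right, impurity=entropy):
    # Parent impurity minus the size-weighted impurity of the two children.
    n = len(parent)
    weighted = (len(left) / n) * impurity(left) + (len(right) / n) * impurity(right)
    return impurity(parent) - weighted

# Hypothetical split: 9 "yes" / 5 "no" labels divided by some feature test.
parent = ["yes"] * 9 + ["no"] * 5
left   = ["yes"] * 6 + ["no"] * 1
right  = ["yes"] * 3 + ["no"] * 4

print(f"entropy(parent) = {entropy(parent):.3f}")  # ~0.940
print(f"gini(parent)    = {gini(parent):.3f}")     # ~0.459
print(f"info gain       = {information_gain(parent, left, right):.3f}")  # ~0.152

The tree evaluates every candidate split this way and picks the one with the highest information gain (or, equivalently, the lowest weighted child impurity).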

============================
Do you want to learn from me?
Check out my affordable mentorship program at: https://learnwith.campusx.in/s/store
============================

📱 Grow with us:
CampusX on LinkedIn: campusx-official
CampusX on Instagram for daily tips: campusx.official
My LinkedIn: nitish-singh-03412789
Discord: discord
E-mail us at [email protected]

⌚Time Stamps⌚

00:00 - Intro
00:14 - Example 1
03:00 - Where is the Tree?
04:00 - Example 2
06:09 - What if we have numerical data?
07:57 - Geometric Intuition
10:50 - Pseudo Code
11:54 - Conclusion
14:00 - Terminology
14:53 - Unanswered Questions
16:16 - Advantages and Disadvantages
18:04 - CART
18:45 - Game Example
21:45 - How do decision trees work? / Entropy
22:15 - What is Entropy
25:40 - How to calculate Entropy
29:40 - Observations
31:35 - Entropy vs Probability
36:20 - Information Gain
41:40 - Gini Impurity
50:30 - Handling Numerical Data