Gaussian Mixture Model (GMM) for clustering - calculate AIC/BIC
494 views · last year
In this video, I implement Gaussian Mixture Model (GMM) clustering using Scikit-Learn. A GMM assumes the data is generated from a finite number of Gaussian distributions, so each Gaussian component represents a particular cluster. We can also calculate the AIC (Akaike Information Criterion) and BIC (Bayesian Information Criterion) for GMMs with different numbers of components to determine which model best fits the data.
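As a quick illustration of the idea above, here is a minimal sketch of GMM clustering with Scikit-Learn. The data is a synthetic two-blob stand-in, not the penguins dataset used in the video:

```python
# Minimal GMM clustering sketch (synthetic data, not the video's penguins dataset).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# Two well-separated Gaussian blobs in 2-D, 50 points each
X = np.vstack([rng.normal(-3, 1, (50, 2)), rng.normal(3, 1, (50, 2))])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)        # hard cluster assignments (0 or 1)
probs = gmm.predict_proba(X)   # soft per-cluster membership probabilities
print(labels[:5], probs.shape)
```

Unlike k-means, `predict_proba` gives each point a soft membership in every component, which is what makes GMM a distribution-based clustering method.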
GitHub address: github.com/randomaccess2023/MG2023/tree/main/Video…
For more details, check Scikit-Learn documentation: scikit-learn.org/stable/modules/mixture.html#gmm
01:04 Import the required libraries
02:50 Load penguins dataset
04:45 Drop NaN values
07:26 Replace categorical variables with numeric values
08:51 Select features and targets
10:25 Perform preprocessing
10:57 Perform GMM for clustering
12:46 Comparison of predictions with targets
17:02 Calculate AIC (Akaike Information Criterion) and BIC (Bayesian Information Criterion) to determine the best fit
#datascience #clustering #python #jupyternotebook #unsupervisedlearning #GaussianMixtureModel #distributionbasedclustering #sklearn #matplotlib
Published on 1402/04/19.