How decision trees are trained 🌲 …
A simple explanation of Gini Impurity and how it's used to train decision trees. In brief:
Gini Impurity is the probability of incorrectly classifying a randomly chosen element in the dataset if it were randomly labeled according to the class distribution in the dataset.
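In other words, with class probabilities p_i, the impurity is G = Σ p_i · (1 − p_i). Here's a minimal sketch of that calculation in Python (not code from the linked post; the function name and example labels are illustrative):

```python
from collections import Counter

def gini_impurity(labels):
    """Gini impurity: G = sum over classes of p_i * (1 - p_i),
    i.e. the chance of mislabeling a randomly chosen element if it
    were labeled according to the dataset's class distribution."""
    counts = Counter(labels)
    n = len(labels)
    return sum((c / n) * (1 - c / n) for c in counts.values())

# A perfectly mixed two-class node has impurity 0.5; a pure node has 0.0.
print(gini_impurity(["blue", "green", "blue", "green"]))  # 0.5
print(gini_impurity(["blue", "blue", "blue"]))            # 0.0
```

When training a decision tree, candidate splits are scored by how much they reduce this impurity, and the split with the largest reduction is chosen.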
Click through for the full breakdown and some helpful examples.