Lecture 6 (ID3, CART)

by CECILIA FIGUERAS TELLEZ

Lecture 6: Compare the version of decision trees used by the sklearn software with the theory seen in class.

This link describes the differences between the ID3 and CART algorithms:

https://scikit-learn.org/stable/modules/tree.html#tree-algorithms-id3-c4-5-c5-0-and-cart

The most notable differences, from my point of view, are:

- ID3 creates a multiway tree, while CART creates a binary tree (see the first sketch after this list).

- In ID3, features must be categorical, so it does not support regression. CART, however, does, since it handles numerical target variables (see the second sketch after this list).

- ID3 calculates the entropy / information gain for each feature and selects the one with the highest information gain (i.e., the largest reduction in entropy). CART, on the other hand, uses the Gini index to measure the degree of impurity of each node; this is why CART aims to generate partitions that are as homogeneous as possible.
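
To illustrate the first and third points, here is a minimal sketch using sklearn (the Iris dataset, max_depth=2, and random_state=0 are just choices I made for the example): the tree built by DecisionTreeClassifier always uses binary splits, and the criterion parameter lets you switch between the Gini index and entropy.

```python
# Minimal sketch (Iris data and max_depth=2 are illustrative choices):
# sklearn's DecisionTreeClassifier follows the CART approach, so every split
# is binary, but the `criterion` parameter lets you pick Gini or entropy.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)

for criterion in ("gini", "entropy"):
    clf = DecisionTreeClassifier(criterion=criterion, max_depth=2, random_state=0)
    clf.fit(X, y)
    print(f"criterion = {criterion}")
    # export_text prints the tree: each internal node has exactly two children,
    # whichever impurity measure is used.
    print(export_text(clf))
```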
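
And for the second point, a minimal sketch (the synthetic data below is made up for illustration) showing that sklearn's CART-style tree also accepts a numerical target through DecisionTreeRegressor, which ID3 cannot do.

```python
# Minimal sketch (synthetic data made up for the example): CART as implemented
# in sklearn also supports regression via DecisionTreeRegressor.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(200, 1))              # one numerical feature
y = np.sin(X).ravel() + rng.normal(0, 0.1, 200)   # continuous target

reg = DecisionTreeRegressor(max_depth=3, random_state=0)
reg.fit(X, y)
print(reg.predict([[1.5], [4.0]]))  # continuous predictions, not class labels
```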