Interactive Tool

Decision Tree

Build decision trees using either information gain (the ID3 criterion) or the Gini index as the splitting measure.

Configuration

Dataset: Play Tennis (14 samples)
Features: Outlook, Temperature, Humidity, Wind
Labels: No, Yes

Feature Metrics

Total Entropy
0.9403
Total Gini
0.4592
Feature        Info Gain   Gini Gain
Outlook        0.2467      0.1163
Humidity       0.1518      0.0918
Wind           0.0481      0.0306
Temperature    0.0292      0.0187
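The numbers above can be reproduced directly from the entropy and information-gain formulas given under Formulas. A minimal Python sketch (function and variable names are my own):

```python
import math
from collections import Counter

# Play Tennis training data: (Outlook, Temperature, Humidity, Wind, Play Tennis)
DATA = [
    ("Sunny", "Hot", "High", "Weak", "No"),
    ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"),
    ("Rain", "Mild", "High", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),
    ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"),
    ("Rain", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"),
    ("Overcast", "Mild", "High", "Strong", "Yes"),
    ("Overcast", "Hot", "Normal", "Weak", "Yes"),
    ("Rain", "Mild", "High", "Strong", "No"),
]

def entropy(rows):
    """H(S) = -sum p(x) * log2(p(x)) over the class labels (last column)."""
    n = len(rows)
    counts = Counter(row[-1] for row in rows)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def info_gain(rows, col):
    """Information Gain = H(parent) - sum(|Sv|/|S|) * H(Sv)."""
    n = len(rows)
    remainder = 0.0
    for value in {row[col] for row in rows}:
        subset = [row for row in rows if row[col] == value]
        remainder += len(subset) / n * entropy(subset)
    return entropy(rows) - remainder

print(round(entropy(DATA), 4))        # 0.9403 (Total Entropy)
print(round(info_gain(DATA, 0), 4))   # 0.2467 (Outlook)
```

Running `info_gain` over all four columns recovers the whole table, confirming that Outlook is the strongest first split under ID3.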

Decision Tree Structure

On this dataset, ID3 selects Outlook (the highest information gain) as the root: Overcast leads directly to Yes, Sunny splits further on Humidity (High → No, Normal → Yes), and Rain splits on Wind (Weak → Yes, Strong → No).

Training Data

 #   Outlook    Temperature  Humidity  Wind    Play Tennis
 1   Sunny      Hot          High      Weak    No
 2   Sunny      Hot          High      Strong  No
 3   Overcast   Hot          High      Weak    Yes
 4   Rain       Mild         High      Weak    Yes
 5   Rain       Cool         Normal    Weak    Yes
 6   Rain       Cool         Normal    Strong  No
 7   Overcast   Cool         Normal    Strong  Yes
 8   Sunny      Mild         High      Weak    No
 9   Sunny      Cool         Normal    Weak    Yes
10   Rain       Mild         Normal    Weak    Yes
11   Sunny      Mild         Normal    Strong  Yes
12   Overcast   Mild         High      Strong  Yes
13   Overcast   Hot          Normal    Weak    Yes
14   Rain       Mild         High      Strong  No
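For illustration, here is a minimal recursive ID3 builder over the table above. This is a sketch, not the tool's implementation: the names and the nested-dict tree representation are my own, and ties between features and unseen feature values are not handled.

```python
import math
from collections import Counter

FEATURES = ["Outlook", "Temperature", "Humidity", "Wind"]
DATA = [  # each row: feature values in FEATURES order, then the class label
    ("Sunny", "Hot", "High", "Weak", "No"),
    ("Sunny", "Hot", "High", "Strong", "No"),
    ("Overcast", "Hot", "High", "Weak", "Yes"),
    ("Rain", "Mild", "High", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Weak", "Yes"),
    ("Rain", "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),
    ("Sunny", "Mild", "High", "Weak", "No"),
    ("Sunny", "Cool", "Normal", "Weak", "Yes"),
    ("Rain", "Mild", "Normal", "Weak", "Yes"),
    ("Sunny", "Mild", "Normal", "Strong", "Yes"),
    ("Overcast", "Mild", "High", "Strong", "Yes"),
    ("Overcast", "Hot", "Normal", "Weak", "Yes"),
    ("Rain", "Mild", "High", "Strong", "No"),
]

def entropy(rows):
    n = len(rows)
    return -sum(c / n * math.log2(c / n)
                for c in Counter(r[-1] for r in rows).values())

def info_gain(rows, col):
    n = len(rows)
    remainder = 0.0
    for v in {r[col] for r in rows}:
        subset = [r for r in rows if r[col] == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(rows) - remainder

def id3(rows, cols):
    labels = [r[-1] for r in rows]
    if len(set(labels)) == 1:      # pure node -> leaf
        return labels[0]
    if not cols:                   # no features left -> majority-class leaf
        return Counter(labels).most_common(1)[0][0]
    best = max(cols, key=lambda c: info_gain(rows, c))
    rest = [c for c in cols if c != best]
    return {FEATURES[best]: {
        v: id3([r for r in rows if r[best] == v], rest)
        for v in {r[best] for r in rows}
    }}

tree = id3(DATA, list(range(len(FEATURES))))
print(tree)
```

On this data the sketch yields the tree described above: Outlook at the root, with the Sunny branch split on Humidity and the Rain branch split on Wind.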

Formulas

Entropy (ID3)

H(S) = -Σ p(x) · log₂(p(x))

Information Gain = H(parent) - Σ(|Sv|/|S|) · H(Sv)

Gini Index

Gini(S) = 1 - Σ p(x)²

Gini Gain = Gini(parent) - Σ(|Sv|/|S|) · Gini(Sv)
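As a check on the Gini formulas, a small sketch using only the Outlook column and class labels from the training data (names are my own):

```python
from collections import Counter

# (Outlook, Play Tennis) pairs for the 14 training samples
SAMPLES = [
    ("Sunny", "No"), ("Sunny", "No"), ("Overcast", "Yes"), ("Rain", "Yes"),
    ("Rain", "Yes"), ("Rain", "No"), ("Overcast", "Yes"), ("Sunny", "No"),
    ("Sunny", "Yes"), ("Rain", "Yes"), ("Sunny", "Yes"), ("Overcast", "Yes"),
    ("Overcast", "Yes"), ("Rain", "No"),
]

def gini(labels):
    """Gini(S) = 1 - sum p(x)^2 over the class labels."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def gini_gain(samples):
    """Gini Gain = Gini(parent) - sum(|Sv|/|S|) * Gini(Sv)."""
    labels = [y for _, y in samples]
    n = len(samples)
    remainder = 0.0
    for value in {x for x, _ in samples}:
        subset = [y for x, y in samples if x == value]
        remainder += len(subset) / n * gini(subset)
    return gini(labels) - remainder

print(round(gini([y for _, y in SAMPLES]), 4))  # 0.4592 (Total Gini)
print(round(gini_gain(SAMPLES), 4))             # 0.1163 (Outlook)
```

The Overcast partition is pure (Gini 0), which is what pulls Outlook's weighted impurity down and makes it the best split under the Gini criterion as well.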