CART Decision Tree & Code

*Definition:* Building a CART decision tree is a recursive process of constructing a binary tree. At each node, CART selects the feature and split point that minimize the Gini index, so the resulting tree is always binary.
Gini index calculation: for a node whose samples belong to $K$ classes, with $p_k$ the proportion of samples in class $k$,

$$\mathrm{Gini}(p) = \sum_{k=1}^{K} p_k (1 - p_k) = 1 - \sum_{k=1}^{K} p_k^{2}$$
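As a rough, self-contained sketch of this criterion (an illustration of the idea, not scikit-learn's internal implementation; the helper names `gini` and `weighted_gini` are made up here), the Gini index of a node and of a candidate binary split can be computed as follows:

import numpy as np

def gini(labels):
    # Gini index of one node: 1 - sum_k p_k^2
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def weighted_gini(labels, mask):
    # Size-weighted Gini index of the two children produced by a boolean
    # split mask; CART picks the split that minimizes this value
    left, right = labels[mask], labels[~mask]
    n = len(labels)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

labels = np.array([0, 0, 0, 1, 1, 1])
print(gini(labels))                        # 0.5: a perfectly mixed node
print(weighted_gini(labels, labels == 0))  # 0.0: a perfectly separating split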
Code:

import matplotlib.pyplot as plt
import numpy as np
from sklearn.metrics import classification_report
from sklearn import tree

# Load the data: the last column is the label, the rest are features
data = np.genfromtxt("LR-testSet.csv", delimiter=",")
x_data = data[:,:-1]
y_data = data[:,-1]

# Visualize the raw samples, colored by class label
plt.scatter(x_data[:, 0], x_data[:, 1], c=y_data)
plt.show()
# Create the decision tree model (DecisionTreeClassifier uses the Gini criterion by default)
model = tree.DecisionTreeClassifier()
# Fit the model on the training data
model.fit(x_data, y_data)

# Export the decision tree
import graphviz # http://www.graphviz.org/

dot_data = tree.export_graphviz(model, 
                                out_file = None, 
                                feature_names = ['x','y'],
                                class_names = ['label0','label1'],
                                filled = True,
                                rounded = True,
                                special_characters = True)
graph = graphviz.Source(dot_data)
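# (Optional, an assumed usage note, not from the original post) In a Jupyter
# notebook, simply evaluating `graph` displays the tree inline; in a plain
# script you could write it to disk instead, e.g.:
# graph.render("cart_tree")  # hypothetical filename; writes cart_tree.pdf by default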

# Get the range of the data values (with a margin of 1 on each side)
x_min, x_max = x_data[:, 0].min() - 1, x_data[:, 0].max() + 1
y_min, y_max = x_data[:, 1].min() - 1, x_data[:, 1].max() + 1

# Generate a mesh grid covering that range with a step of 0.02
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.02),
                     np.arange(y_min, y_max, 0.02))

# ravel and flatten both turn a multi-dimensional array into 1-D:
# flatten always returns a copy, while ravel returns a view where possible,
# so modifying ravel's result can affect the original array
z = model.predict(np.c_[xx.ravel(), yy.ravel()])
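# A tiny illustration of that difference, on an assumed toy array (kept as
# comments so it does not interfere with this script):
# a = np.array([[1, 2], [3, 4]])
# a.flatten()[0] = 99   # 'a' is unchanged: flatten copied the data
# a.ravel()[0] = 99     # 'a[0, 0]' becomes 99: ravel returned a view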
print(xx.ravel())
z = z.reshape(xx.shape)
# Filled contour plot of the predicted decision regions
cs = plt.contourf(xx, yy, z)
# Scatter plot of the samples on top of the decision regions
plt.scatter(x_data[:, 0], x_data[:, 1], c=y_data)
plt.show()

# Evaluate on the training data; classification_report expects (y_true, y_pred)
predictions = model.predict(x_data)
print(classification_report(y_data, predictions))
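The classifier above uses scikit-learn's default settings, which let the tree grow until the leaves are pure and can overfit the training data. As an illustrative variation (the parameter values here are example assumptions, not part of the original post), the CART criterion and the tree size can be set explicitly:

# Make the Gini criterion explicit and cap the depth to reduce overfitting;
# max_depth=3 is an arbitrary example value
pruned_model = tree.DecisionTreeClassifier(criterion="gini", max_depth=3)
pruned_model.fit(x_data, y_data)
print(classification_report(y_data, pruned_model.predict(x_data)))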
