Visualizing Gradient Descent
Having covered the theory in the previous post, let's put it into practice, starting with a look at gradient descent in action.
Here is the code:
import numpy as np

# Objective function
def func(x):
    return np.square(x)

# First derivative of the objective function
def dfunc(x):
    return 2 * x

def GD_momentum(x_start, df, epochs, lr, momentum):
    xs = np.zeros(epochs + 1)
    x = x_start
    xs[0] = x
    v = 0
    for i in range(epochs):
        dx = df(x)
        v = -dx * lr + momentum * v  # step to apply to x: gradient term plus momentum term
        x += v
        xs[i + 1] = x                # store the new x for plotting
    return xs
The code above first defines the objective function and its first derivative, then updates the parameter with momentum gradient descent, storing every new value of x so it can be plotted later. The resulting trajectory is shown below:
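To reproduce that figure, GD_momentum can be called and the trajectory overlaid on the objective function. The starting point, learning rate, momentum, and epoch count below are illustrative choices, not values taken from the original post:

import matplotlib.pyplot as plt

line_x = np.linspace(-5, 5, 100)   # range over which to draw the objective
line_y = func(line_x)

x_start = 5                # illustrative starting point
epochs = 10                # illustrative number of iterations
lr, momentum = 0.1, 0.7    # illustrative hyperparameters

xs = GD_momentum(x_start, dfunc, epochs, lr, momentum)

plt.plot(line_x, line_y, c='b')             # the objective function x^2
plt.plot(xs, func(xs), c='r', marker='o')   # path taken by the parameter updates
plt.title('Gradient Descent with Momentum')
plt.show()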
As the figure shows, the parameter moves steadily toward the value that minimizes the objective function. With gradient descent covered, let's run a small binary-classification experiment. First, a look at the data: the first two columns are the features, the third column is the label, and each row is one sample.
Let's start by plotting the data distribution.
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline

np.random.seed(1234)

# Load the data: each row becomes [1.0, x1, x2] plus an integer label
def loaddata():
    dataMat = []
    labelMat = []
    for line in open('./data.txt', 'r'):
        line = line.strip().split()
        dataMat.append([1.0, float(line[0]), float(line[1])])
        labelMat.append(int(line[2]))
    return dataMat, labelMat

# Scatter plot of the two classes
def plotDataSet():
    data, label = loaddata()
    data = np.array(data)
    xcord1 = []
    ycord1 = []
    xcord2 = []
    ycord2 = []
    for i in range(data.shape[0]):
        if int(label[i]) == 1:
            xcord1.append(data[i, 1])
            ycord1.append(data[i, 2])
        else:
            xcord2.append(data[i, 1])
            ycord2.append(data[i, 2])
    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.scatter(xcord1, ycord1, s=20, c='red', marker='s', alpha=0.5)
    ax.scatter(xcord2, ycord2, s=20, c='green', alpha=0.5)
    plt.title('DataSet')
    plt.xlabel('X1')
    plt.ylabel('X2')
    plt.show()
Calling plotDataSet() produces the scatter plot of the two classes, as shown in the figure.
From the plot, the two classes are clearly separated and should be easy to split with a linear boundary. We need a classifier to learn that boundary; here we use logistic regression, trained by gradient ascent on the log-likelihood:
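For reference, the training loop below performs gradient ascent on the log-likelihood of logistic regression. In standard notation (my summary, not a formula from the original post), with feature matrix $X$, 0/1 label vector $y$, and step size $\alpha$:

\[
  w \leftarrow w + \alpha\, X^{\top}\bigl(y - \sigma(Xw)\bigr)
\]

In the code below this appears as error = -(label - h) followed by weights = weights - alpha * data.transpose() * error, which is the same update written with a double negative.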
# Sigmoid function
def sigmoid(inX):
    return 1.0 / (1 + np.exp(-inX))

# Logistic regression trained with batch gradient ascent
def gradAscent(data, label):
    data = np.mat(data)                  # (m, n) feature matrix
    label = np.mat(label).transpose()    # (m, 1) label vector
    alpha = 0.0001                       # learning rate
    maxEpoch = 700                       # number of iterations
    m, n = np.shape(data)
    weights = np.ones((n, 1))
    for i in range(maxEpoch):
        h = sigmoid(data * weights)      # predicted probabilities
        error = -(label - h)
        weights = weights - alpha * data.transpose() * error
    return weights.getA()                # return the weights as an ndarray
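As a side note, np.mat / np.matrix is discouraged in recent NumPy releases; a sketch of the same training loop written with plain arrays (my rewrite under that assumption, not code from the original post):

def gradAscentArr(data, label, alpha=0.0001, maxEpoch=700):
    X = np.asarray(data, dtype=float)                   # (m, n) feature matrix
    y = np.asarray(label, dtype=float).reshape(-1, 1)   # (m, 1) label vector
    weights = np.ones((X.shape[1], 1))
    for _ in range(maxEpoch):
        h = sigmoid(X @ weights)                 # predicted probabilities
        weights += alpha * X.T @ (y - h)         # gradient ascent step
    return weights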
# Plot the data and the fitted decision boundary
def plotBestFit(weights):
    dataMat, labelMat = loaddata()
    dataArr = np.array(dataMat)
    n = np.shape(dataMat)[0]
    xcord1 = []; ycord1 = []
    xcord2 = []; ycord2 = []
    for i in range(n):
        if int(labelMat[i]) == 1:
            xcord1.append(dataArr[i, 1]); ycord1.append(dataArr[i, 2])
        else:
            xcord2.append(dataArr[i, 1]); ycord2.append(dataArr[i, 2])
    fig = plt.figure()
    ax = fig.add_subplot(111)
    ax.scatter(xcord1, ycord1, s=20, c='red', marker='s', alpha=.5)
    ax.scatter(xcord2, ycord2, s=20, c='green', alpha=.5)
    # Decision boundary: w0 + w1*x1 + w2*x2 = 0  =>  x2 = (-w0 - w1*x1) / w2
    x = np.arange(-3.0, 3.0, 0.1)
    y = (-weights[0] - weights[1] * x) / weights[2]
    ax.plot(x, y)
    plt.title('BestFit')
    plt.xlabel('X1'); plt.ylabel('X2')
    plt.show()
With the learned parameters, we can draw the decision boundary:
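Putting it together, and assuming the same whitespace-separated data.txt with two features plus a 0/1 label per row, the whole pipeline runs like this:

data, label = loaddata()
weights = gradAscent(data, label)
print(weights)          # learned parameters [w0, w1, w2]
plotBestFit(weights)    # scatter plot plus the fitted decision boundary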