Implementing Gradient Descent in Python

This series assumes the reader already understands the algorithm's theory and shows only the code implementation. If anything is unclear, consult the corresponding textbook, or see Week 2: the vectorized derivation of gradient descent.
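As a quick reminder of that vectorized derivation, with the data laid out row-major as in the code below (X of shape (2, m), first row all ones, and theta a length-2 vector), the update and loss can be written as:

```latex
\theta \leftarrow \theta - \frac{\alpha}{m}\,(\theta^{\top} X - y)\, X^{\top},
\qquad
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left(\hat{y}^{(i)} - y^{(i)}\right)^{2}
```

This matches the `np.dot((y_pred - y), X.T)` expression in the implementation; note the layout is transposed relative to the more common convention where X has one sample per row.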

import numpy as np
import matplotlib.pyplot as plt

# Synthetic data: X has shape (2, 100) with a bias row of ones;
# the true parameters are [4, 3], plus uniform noise on y.
X = np.array([np.ones(100), np.random.rand(100)])
y = np.dot([4, 3], X) + np.random.rand(100)
plt.scatter(X[1, :], y)
plt.show()
print(X.shape)  # (2, 100)
print(y.shape)  # (100,)
print('gradient descent start')
alpha = 0.01     # learning rate
num_iter = 1000  # number of iterations


def gradient_descent(theta, X, y, alpha, num_iter):
    loss_history = np.zeros(num_iter)
    theta_history = np.zeros((num_iter, 2))
    m = len(y)
    for i in range(num_iter):
        y_pred = np.dot(theta, X)
        # Loss at the current theta (recorded before the update so that
        # loss_history[i] corresponds to the theta that produced y_pred).
        loss = 1/(2*m) * np.sum(np.square(y_pred - y))
        # Vectorized update: theta <- theta - (alpha/m) * (y_pred - y) @ X.T
        theta = theta - alpha/m * np.dot(y_pred - y, X.T)
        theta_history[i, :] = theta
        loss_history[i] = loss
        if i % 100 == 0:
            print('theta is', theta)
            print('current loss is', loss)
    return theta, theta_history, loss_history


# Random initialization, then run gradient descent.
theta_ini = np.random.randn(2)
theta, theta_history, loss_history = gradient_descent(theta_ini, X, y, alpha, num_iter)

# Per-iteration decrease of the loss, to visualize convergence speed.
# (The history has num_iter entries, not len(y).)
residual = np.zeros(num_iter - 1)
for i in range(num_iter - 1):
    residual[i] = loss_history[i] - loss_history[i+1]

print(theta)  # learned parameters

plt.plot(loss_history)  # loss curve over iterations
plt.show()
plt.plot(residual)      # per-iteration loss decrease
plt.show()
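To sanity-check the learned theta, one can compare against the closed-form least-squares solution on the same kind of synthetic data. This is a minimal sketch, not part of the original post; it assumes a fixed seed via `default_rng`, whereas the post uses unseeded `np.random`:

```python
import numpy as np

# Regenerate synthetic data of the same form: bias row of ones plus one
# feature row, true parameters [4, 3], uniform noise (mean 0.5).
rng = np.random.default_rng(0)
X = np.array([np.ones(100), rng.random(100)])  # shape (2, 100)
y = np.dot([4, 3], X) + rng.random(100)

# Closed-form fit: solve min ||X.T @ theta - y||^2 directly.
theta_lstsq, *_ = np.linalg.lstsq(X.T, y, rcond=None)
print('least-squares theta:', theta_lstsq)
```

Because the uniform noise has mean 0.5, the recovered intercept should sit near 4.5 rather than 4, while the slope stays near 3; a well-tuned gradient-descent run should land close to the same values.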

The results look as follows. Fitting this scatter plot:
[Figure: scatter plot with the fitted regression line]
The convergence looks like this:
[Figure: loss curve and per-iteration residual]
