Hand-Coding Gradient Descent for Two-Variable Linear Regression: Python Code

A Python implementation of the gradient descent method

Code outline

  1. Import the required libraries
  2. Load the required data
  3. Transform the data: extract X and Y
  4. Set initial values
    Specify the coefficient values, learning rate, and number of iterations up front
  5. Define the loss function (two-variable linear regression)
    $J(\theta)=\frac{1}{2m}\sum_{i=1}^{m}\left[h_\theta(x^{(i)})-y^{(i)}\right]^2$, where $h_\theta(x^{(i)})=\theta_0+\theta_1x_1^{(i)}+\theta_2x_2^{(i)}$
  6. Work out the gradient for each coefficient (the explicit update rule is spelled out right after this list)
    $\frac{\partial J(\theta)}{\partial\theta_j}=\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)})-y^{(i)}\right)x_j^{(i)}=\frac{1}{m}\sum_{i=1}^{m}\left(\theta_0+\theta_1x_1^{(i)}+\theta_2x_2^{(i)}-y^{(i)}\right)x_j^{(i)}$, with $x_0^{(i)}=1$
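Putting the two formulas together, each iteration of gradient descent moves every coefficient a small step against its own gradient; this is exactly what the `theta0 -= alpha*theta0_grad` (and analogous) lines in the code below implement:

$\theta_j := \theta_j-\alpha\,\frac{\partial J(\theta)}{\partial\theta_j},\quad j=0,1,2$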

Code implementation

  1. Import the required libraries
import numpy as np 
  2. Load the required data
data = np.genfromtxt("線性迴歸/data/Delivery.csv",delimiter=',')
  3. Transform the data: extract X and Y
X = data[:,:-1]
Y = data[:,-1]
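If the Delivery.csv file is not available locally, the same X and Y arrays can be built directly in code; a minimal sketch, with the values copied from the data table at the end of this post:

data = np.array([
    [100, 4, 9.3],
    [50, 3, 4.8],
    [100, 4, 8.9],
    [100, 2, 6.5],
    [50, 2, 4.2],
    [80, 2, 6.2],
    [75, 3, 7.4],
    [65, 4, 6],
    [90, 3, 7.6],
    [90, 2, 6.1],
])
X = data[:, :-1]  # first two columns: X1, X2
Y = data[:, -1]   # last column: Y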
  4. Set initial values
theta0 = 0  # intercept
theta1 = 0  # coefficient of X1
theta2 = 0  # coefficient of X2
epochs = 100  # number of iterations
alpha = 0.001  # learning rate
  5. Define the loss function (written from the formula above)
def cost(X,Y,theta0,theta1,theta2):
    loss = 0
    m = len(Y)
    for i in range(m):
        loss += (theta0+theta1*X[i,0]+theta2*X[i,1]-Y[i])**2
    loss = loss/(2*m)
    return loss
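As a quick sanity check, the same loss can be computed in a single vectorized NumPy expression. The helper below is not part of the original code, just a hypothetical cross-check that should return the same value as cost:

def cost_vec(X, Y, theta0, theta1, theta2):
    # vectorized version of the loop above: sum of squared errors over 2m
    pred = theta0 + theta1 * X[:, 0] + theta2 * X[:, 1]
    return np.sum((pred - Y) ** 2) / (2 * len(Y))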
  6. Define gradient descent (written from the formula above)
def grad_des(X,Y,theta0,theta1,theta2,alpha,epochs):
    m = len(Y)
    for z in range(epochs):
        theta0_grad = 0
        theta1_grad = 0
        theta2_grad = 0
        for i in range(m):
            # accumulate the gradient over all samples (note +=, not =)
            theta0_grad += (theta0+theta1*X[i,0]+theta2*X[i,1]-Y[i])
            theta1_grad += (theta0+theta1*X[i,0]+theta2*X[i,1]-Y[i])*X[i,0]
            theta2_grad += (theta0+theta1*X[i,0]+theta2*X[i,1]-Y[i])*X[i,1]
        theta0_grad = theta0_grad/m
        theta1_grad = theta1_grad/m
        theta2_grad = theta2_grad/m
        # update the coefficients once per epoch, inside the epoch loop
        theta0 -= alpha*theta0_grad
        theta1 -= alpha*theta1_grad
        theta2 -= alpha*theta2_grad
    return theta0,theta1,theta2
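The same procedure can also be written in vectorized form without the explicit sample loop. This grad_des_vec variant is not part of the original post; it is a sketch assuming the three coefficients are passed as a single NumPy array theta = [theta0, theta1, theta2]:

def grad_des_vec(X, Y, theta, alpha, epochs):
    m = len(Y)
    Xb = np.hstack([np.ones((m, 1)), X])  # prepend a column of ones so Xb @ theta includes theta0
    for _ in range(epochs):
        grad = Xb.T @ (Xb @ theta - Y) / m  # same gradient formula, computed for all samples at once
        theta = theta - alpha * grad
    return theta

Called as grad_des_vec(X, Y, np.zeros(3), alpha, epochs), it should produce the same three coefficients as grad_des above.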
  7. Print the results (coefficient values and loss)
print('begin...theta0={},theta1={},theta2={},loss={}'.format(theta0,theta1,theta2,cost(X,Y,theta0,theta1,theta2)))
print('running...')
theta0,theta1,theta2 = grad_des(X,Y,theta0,theta1,theta2,alpha,epochs)
print('end...theta0={},theta1={},theta2={},loss={}'.format(theta0,theta1,theta2,cost(X,Y,theta0,theta1,theta2)))
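Once the coefficients have been learned, predicting a new sample is just evaluating the hypothesis at its two feature values. The sample (X1=90, X2=3) below is taken from the data table and used purely for illustration:

y_pred = theta0 + theta1 * 90 + theta2 * 3  # h_theta for X1=90, X2=3
print('prediction for X1=90, X2=3: {}'.format(y_pred))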

Data

X1 X2 Y
100 4 9.3
50 3 4.8
100 4 8.9
100 2 6.5
50 2 4.2
80 2 6.2
75 3 7.4
65 4 6
90 3 7.6
90 2 6.1