In the previous section we implemented linear regression with a single input variable. In this exercise we implement multivariate linear regression using both gradient descent and the normal equations, and examine how the cost function, convergence, and the learning rate relate to one another. Following Exercise: Multivariate Linear Regression, the goal is to predict the price of a house with a living area of 1650 square feet and 3 bedrooms.
Data
First, a look at the dataset: the training set consists of house prices from the Portland, Oregon area, with the living area and number of bedrooms as input features and the price as the target.
Background
Gradient descent
- The hypothesis is the same as before:

$$h_\theta(x) = \theta^T x = \sum_{i=0}^{n} \theta_i x_i$$

- Gradient update rule (updating every $\theta_j$ simultaneously):

$$\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$$

Cost function:

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

Vector form of the cost function:

$$J(\theta) = \frac{1}{2m} (X\theta - y)^T (X\theta - y)$$

where $X$ is the $m \times (n+1)$ design matrix whose rows are the training examples (with $x_0 = 1$), $y$ is the vector of targets, and $m$ is the number of training examples.
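As a quick sanity check that the summation form and the vector form of $J(\theta)$ are the same quantity, here is a NumPy sketch on made-up toy data (not part of the original exercise):

```python
import numpy as np

# Toy data: m = 4 examples, n = 2 features plus the x0 = 1 intercept column.
X = np.array([[1.0, 2.0, 3.0],
              [1.0, 1.0, 0.5],
              [1.0, 4.0, 2.0],
              [1.0, 3.0, 1.0]])
y = np.array([7.0, 3.0, 11.0, 8.0])
theta = np.array([0.5, 1.0, -0.5])
m = len(y)

# Summation form: J = 1/(2m) * sum_i (h(x_i) - y_i)^2
J_sum = sum((X[i] @ theta - y[i]) ** 2 for i in range(m)) / (2 * m)

# Vector form: J = 1/(2m) * (X*theta - y)' (X*theta - y)
r = X @ theta - y
J_vec = (r @ r) / (2 * m)
```

The two values agree to machine precision; the vector form is what the MATLAB code below computes.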
Normal equations

The closed-form solution is

$$\theta = (X^T X)^{-1} X^T y$$

Using this formula requires no feature scaling, and you get an exact solution in one computation: there is no "loop until convergence" as in gradient descent.
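Where the closed form comes from (a standard derivation, not spelled out in the original): set the gradient of the vectorized cost to zero,

$$\nabla_\theta J(\theta) = \frac{1}{m} X^T (X\theta - y) = 0
\quad\Longrightarrow\quad X^T X\,\theta = X^T y
\quad\Longrightarrow\quad \theta = (X^T X)^{-1} X^T y$$

assuming $X^T X$ is invertible.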
Experiments and results
Running the gradient-descent script below with

alpha =
    0.0100    0.0300    0.1000    0.3000    1.0000    1.3000

yields, for alpha = 1:

theta_grad_descent =
   1.0e+05 *
    3.4041
    1.1063
   -0.0665

price_grad_descent =
   2.9308e+05
Normal equations code:
% Normal equations
x = load('ex3x.dat');
y = load('ex3y.dat');
x = [ones(size(x,1), 1), x];        % add the x0 = 1 intercept column
theta_norequ = (x' * x) \ (x' * y)  % solve (X'X)*theta = X'y; avoids explicit inv()
price_norequ = theta_norequ' * [1 1650 3]'
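For comparison, the same computation in NumPy (a sketch: the synthetic data below merely stands in for ex3x.dat/ex3y.dat, and the coefficients are invented). As with MATLAB's backslash, `np.linalg.solve` on the normal system is numerically preferable to forming the inverse:

```python
import numpy as np

# Synthetic stand-in for ex3x.dat / ex3y.dat: area, bedrooms -> price.
rng = np.random.default_rng(0)
area = rng.uniform(1000, 3000, size=47)
beds = rng.integers(1, 6, size=47).astype(float)
X = np.column_stack([np.ones(47), area, beds])
true_theta = np.array([50000.0, 100.0, 5000.0])  # assumed "true" coefficients
y = X @ true_theta

# Normal equations: solve (X'X) theta = X'y rather than inverting X'X.
theta = np.linalg.solve(X.T @ X, X.T @ y)

# Predict the price of a 1650 sq ft, 3-bedroom house.
price = theta @ np.array([1.0, 1650.0, 3.0])
```

Because the synthetic targets are noise-free, the recovered `theta` matches the generating coefficients up to floating-point conditioning.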
Gradient descent code:
% Multivariate Linear Regression
x = load('ex3x.dat');
y = load('ex3y.dat');

% Add the x0 = 1 intercept term into the x matrix.
x = [ones(size(x,1), 1), x];

% Standardize the two features (columns 2 and 3); column 1 is the intercept.
sigma = std(x);   % per-column standard deviations
mu = mean(x);     % per-column means
x(:,2) = (x(:,2) - mu(2)) ./ sigma(2);
x(:,3) = (x(:,3) - mu(3)) ./ sigma(3);

% Gradient descent for several learning rates.
figure
alpha = [0.01, 0.03, 0.1, 0.3, 1, 1.3];
itera_num = 100;                  % number of iterations
sample_num = size(x, 1);          % number of training examples m
plotstyle = {'b', 'r', 'g', 'k', 'b--', 'r--'};
theta_grad_descent = zeros(size(x(1,:)));
for alpha_i = 1:length(alpha)
    theta = zeros(size(x(1,:)))';
    Jtheta = zeros(itera_num, 1);
    for i = 1:itera_num
        Jtheta(i) = (0.5/sample_num) .* (x*theta - y)' * (x*theta - y);
        grad = (1/sample_num) .* x' * (x*theta - y);
        theta = theta - alpha(alpha_i) .* grad;
    end
    % Plot the first 50 values of J; technically the first J is at iteration 0.
    plot(0:49, Jtheta(1:50), plotstyle{alpha_i}, 'LineWidth', 2)
    hold on
    if alpha(alpha_i) == 1        % keep the theta obtained with alpha = 1
        theta_grad_descent = theta
    end
end
legend('0.01', '0.03', '0.1', '0.3', '1', '1.3');
xlabel('Number of iterations')
ylabel('Cost J')

% Predict the price of a 1650 sq ft, 3-bedroom house; the query features
% must be scaled with the training-set mean and standard deviation.
price_grad_descent = theta_grad_descent' * [1, (1650 - mu(2))/sigma(2), (3 - mu(3))/sigma(3)]'
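The same loop translates almost line for line to NumPy. The sketch below uses synthetic data standing in for ex3x.dat/ex3y.dat (coefficients invented); note that the prediction scales the query point with the training-set mean and standard deviation, exactly as the last line of the MATLAB script does:

```python
import numpy as np

# Synthetic stand-in for the exercise data.
rng = np.random.default_rng(1)
area = rng.uniform(1000, 3000, size=47)
beds = rng.integers(1, 6, size=47).astype(float)
y = 50000.0 + 100.0 * area + 5000.0 * beds  # assumed linear relation

# Standardize each feature, then add the intercept column.
feats = np.column_stack([area, beds])
mu, sigma = feats.mean(axis=0), feats.std(axis=0)
X = np.column_stack([np.ones(len(y)), (feats - mu) / sigma])

# Batch gradient descent on J(theta) = 1/(2m) * ||X*theta - y||^2.
alpha, iters, m = 1.0, 100, len(y)
theta = np.zeros(3)
for _ in range(iters):
    grad = X.T @ (X @ theta - y) / m
    theta -= alpha * grad

# Predict: scale the query features with the training mu/sigma.
q = np.concatenate([[1.0], (np.array([1650.0, 3.0]) - mu) / sigma])
price_gd = theta @ q
```

With standardized features the eigenvalues of $X^T X / m$ sit near 1, which is why a learning rate as large as 1 converges quickly here; on the unscaled data it would diverge.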