Multivariate Linear Regression Code - Andrew NG Machine Learning Ex3

Homework code for Exercise 3 of Andrew NG's machine-learning course; the full problem statement is on the course's exercise page.

The MATLAB implementation is shown below.
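For reference, the main loop below implements the vectorized cost function and batch gradient-descent update

J(\theta) = \frac{1}{2m}\,(X\theta - y)^\top (X\theta - y), \qquad \theta := \theta - \frac{\alpha}{m}\, X^\top (X\theta - y),

where X is the design matrix (a leading column of ones plus the two standardized features) and y is the vector of house prices.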

x = load('ex3x.dat');   % features: living area (sq ft) and number of bedrooms
y = load('ex3y.dat');   % target: house prices
m = size(x, 1);         % number of training examples
x = [ones(m,1), x];     % prepend a column of ones (intercept term)

% Feature scaling: standardize each feature to zero mean and unit variance;
% column 1 is the intercept and is left unchanged
sigma = std(x);
mu = mean(x);
x(:,2) = (x(:,2) - mu(2)) ./ sigma(2);
x(:,3) = (x(:,3) - mu(3)) ./ sigma(3);

alpha = [0.01, 0.03, 0.1, 0.3, 1.0, 1.3];       % candidate learning rates
plotstyle = {'b', 'r', 'g', 'k', 'b--', 'r--'}; % one line style per learning rate
J = zeros(50, 1);                               % cost history over 50 iterations

for k = 1:length(alpha)
    theta = zeros(size(x(1,:)))';   % initialize fitting parameters
    for num_iterations = 1:50
        J(num_iterations) = ((x*theta - y)'*(x*theta - y))/(2*m);   % vectorized cost J(theta)
        theta = theta - alpha(k)*((x*theta - y)'*x)'/m;             % batch gradient-descent update
    end

    if alpha(k) == 1
        thetaBest = theta;   % keep the parameters learned with alpha = 1.0
    end

    % plot the cost history for this learning rate; the first J value
    % corresponds to the zeroth iteration, but MATLAB/Octave indexing starts at 1
    hold on;
    plot(0:49, J(1:50), plotstyle{k});
end
xlabel('Number of iterations');
ylabel('Cost J');
legend('0.01', '0.03', '0.1', '0.3', '1.0', '1.3');

% Predict the price of a 1650 sq-ft, 3-bedroom house; the query features must be
% standardized with the same mu and sigma as the training data
price_grad_desc = dot(thetaBest, [1, (1650 - mu(2))/sigma(2), (3 - mu(3))/sigma(3)]);
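
% Optional sketch: convert thetaBest back to the original (unscaled) feature
% space so it can be compared directly with the normal-equation solution below;
% this simply expands the standardization inside the hypothesis h(x).
theta_unscaled = [thetaBest(1) - thetaBest(2)*mu(2)/sigma(2) - thetaBest(3)*mu(3)/sigma(3);
                  thetaBest(2)/sigma(2);
                  thetaBest(3)/sigma(3)];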

% Normal Equation
% Reload the data, since x was overwritten by the scaled features above.
x = load('ex3x.dat');
y = load('ex3y.dat');
m = size(x, 1);
x = [ones(m,1), x];

theta_normal = (x' * x) \ (x' * y);              % closed-form least-squares solution
price_normal = dot(theta_normal, [1, 1650, 3]);  % prediction uses the raw, unscaled features
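
For reference, the closed-form solution computed above is the normal equation

\theta = (X^\top X)^{-1} X^\top y,

applied here to the raw, unscaled features, so theta_normal and price_normal need no mu/sigma adjustment. As an optional sanity check, the two predictions can be printed side by side; they should agree closely once gradient descent has converged with a well-chosen learning rate:

% Compare the predicted prices for a 1650 sq-ft, 3-bedroom house
fprintf('Price via gradient descent: %.2f\n', price_grad_desc);
fprintf('Price via normal equation:  %.2f\n', price_normal);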

