Key points of linear regression (if the summary image is hard to read, open it in a new tab or save it):
How to implement linear regression in code (MATLAB):
1. The cost function: use it to check that the cost decreases on every iteration (save the result of each gradient-descent update and plot the values), which makes it easier to choose a suitable learning rate alpha.
function J = computeCost(X, y, theta)
m = length(y); % number of training examples
J = sum((X*theta - y).^2) / (2*m);
end
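For comparison, here is a minimal NumPy sketch of the same cost computation; the Python port and the toy data (a perfect fit y = 1 + 2x) are my own additions, not part of the original MATLAB exercise:

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost J = sum((X@theta - y)^2) / (2m)."""
    m = len(y)
    return np.sum((X @ theta - y) ** 2) / (2 * m)

# Toy check: with y = 1 + 2x fit exactly, the cost is zero.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])  # intercept column + one feature
y = np.array([3.0, 5.0, 7.0])
theta = np.array([1.0, 2.0])
print(compute_cost(X, y, theta))  # → 0.0
```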
2. The gradient descent function: call it in a loop, repeatedly updating theta until it converges.
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
    theta = theta - (alpha/m) * X' * (X*theta - y);
    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);
end
end
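The same update rule can be sketched in NumPy; the toy dataset below (again y = 1 + 2x) and the choice of alpha and iteration count are illustrative assumptions, not values from the original exercise:

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent, recording the cost after every update."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        J_history[it] = np.sum((X @ theta - y) ** 2) / (2 * m)
    return theta, J_history

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([3.0, 5.0, 7.0])
theta, J_hist = gradient_descent(X, y, np.zeros(2), alpha=0.1, num_iters=2000)
print(np.round(theta, 3))      # close to [1, 2], the true intercept and slope
print(J_hist[0] > J_hist[-1])  # the cost decreases over the iterations
```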
3. Feature scaling: the columns of the training set X describe different quantities and can differ greatly in magnitude, so they are usually scaled or normalized first.
function [X_norm, mu, sigma] = featureNormalize(X)
mu = mean(X);         % save the means: new samples must be normalized with them before prediction
sigma = std(X, 0, 1); % save the standard deviations, for the same reason
X_norm = (X - repmat(mu, size(X, 1), 1)) ./ repmat(sigma, size(X, 1), 1);
end
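A NumPy sketch of the same z-score normalization; note `ddof=1`, which matches MATLAB's default sample standard deviation. The housing rows are made up for illustration:

```python
import numpy as np

def feature_normalize(X):
    """Z-score each column; return mu and sigma so new samples can be normalized too."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)  # ddof=1 matches MATLAB's std(X, 0, 1)
    return (X - mu) / sigma, mu, sigma

X = np.array([[2104.0, 3.0], [1600.0, 3.0], [2400.0, 4.0]])
X_norm, mu, sigma = feature_normalize(X)
print(X_norm.mean(axis=0))          # each column now has mean ~0
print(X_norm.std(axis=0, ddof=1))   # and sample standard deviation ~1
```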
4. With these three functions we can solve for theta, i.e. obtain our fitted line. Finally, let's put it all together and predict the price of a new sample.
% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);
% Scale features and set them to zero mean
[X, mu, sigma] = featureNormalize(X);
% Add intercept term to X
X = [ones(m, 1) X];
% Choose some alpha value
alpha = 1;      % a fairly large rate works here because the features are normalized
num_iters = 50;
% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');
% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');
% Estimate the price of a 1650 sq-ft, 3 br house
XX = [1650 3];
XX = [ones(1, 1) (XX - mu) ./ sigma]; % normalize with the training-set mu and sigma, then add the intercept term
price = XX * theta;
fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
'(using gradient descent):\n $%f\n'], price);
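The whole pipeline can also be checked end to end in NumPy. Since ex1data2.txt is not reproduced here, the sketch below generates noiseless synthetic data with a known linear relation (coefficients chosen arbitrarily), so the predicted price can be verified against the ground truth:

```python
import numpy as np

# Synthetic stand-in for ex1data2.txt: price = 50*size + 20000*bedrooms + 10000.
rng = np.random.default_rng(0)
size = rng.uniform(800, 4000, 50)
beds = rng.integers(1, 6, 50).astype(float)
y = 50.0 * size + 20000.0 * beds + 10000.0

# Normalize features, add the intercept column.
X = np.column_stack([size, beds])
mu, sigma = X.mean(axis=0), X.std(axis=0, ddof=1)
Xn = np.column_stack([np.ones(len(y)), (X - mu) / sigma])

# Gradient descent with the same alpha and iteration count as the script above.
theta = np.zeros(3)
alpha, m = 1.0, len(y)
for _ in range(50):
    theta -= (alpha / m) * (Xn.T @ (Xn @ theta - y))

# Predict a 1650 sq-ft, 3-bedroom house, normalizing with the training mu/sigma.
x_new = np.concatenate([[1.0], (np.array([1650.0, 3.0]) - mu) / sigma])
price = x_new @ theta
print(price)  # ≈ 152500, the true value 50*1650 + 20000*3 + 10000
```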