[Exercise 4] Regularization

In this exercise we implement regularized linear regression and regularized logistic regression. Data: ex5Data.zip. A problem that arises very easily when fitting data is overfitting, so regularization is needed to help with model selection.

Background

Regularized linear regression

The hypothesis is a fifth-order polynomial:

$$h_\theta(x) = \theta_0 + \theta_1 x + \theta_2 x^2 + \theta_3 x^3 + \theta_4 x^4 + \theta_5 x^5$$

The cost function to minimize, with an L2 penalty added:

$$J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2 + \lambda\sum_{j=1}^{n}\theta_j^2\right]$$

Recall the normal equations from before:

$$\theta = (X^T X)^{-1} X^T y$$

With the regularization term this becomes

$$\theta = \left(X^T X + \lambda\begin{bmatrix}0 & & & \\ & 1 & & \\ & & \ddots & \\ & & & 1\end{bmatrix}\right)^{-1} X^T y$$

where the top-left diagonal entry is 0 so that the intercept $\theta_0$ is not penalized.
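As a quick numeric sanity check, here is a minimal MATLAB sketch of this closed-form solution on toy data (the variable names are my own, not from the exercise):

% Regularized normal equation on toy data (illustrative sketch)
m = 20; n = 3;
X = [ones(m,1), randn(m, n-1)];     % design matrix with an intercept column
y = randn(m, 1);
lambda = 1;
D = diag([0; ones(n-1,1)]);         % 0 in the corner: theta_0 is not penalized
theta = (X'*X + lambda*D) \ (X'*y)  % closed-form regularized solution

Note the use of backslash rather than inv(); it solves the same system but is numerically safer.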

Regularized logistic regression

The hypothesis function:

$$h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}$$

Here x is a 28-feature vector: all monomials of the two raw inputs u and v up to degree 6,

$$x = \left[1,\; u,\; v,\; u^2,\; uv,\; v^2,\; u^3,\; \dots,\; uv^5,\; v^6\right]^T$$

(for degree d = 6 there are (d+1)(d+2)/2 = 28 such terms).

The cost function with regularization added:

$$J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\left[-y^{(i)}\log\!\left(h_\theta(x^{(i)})\right) - (1 - y^{(i)})\log\!\left(1 - h_\theta(x^{(i)})\right)\right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2$$

Now look at how Newton's method changes. The update rule is still

$$\theta^{(t+1)} = \theta^{(t)} - H^{-1}\nabla_\theta J$$

but the gradient ∇θJ and the Hessian H pick up regularization terms:

$$\nabla_\theta J = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x^{(i)} + \frac{\lambda}{m}\begin{bmatrix}0 \\ \theta_1 \\ \vdots \\ \theta_n\end{bmatrix}$$

$$H = \frac{1}{m}\sum_{i=1}^{m}\left[h_\theta(x^{(i)})\left(1 - h_\theta(x^{(i)})\right)x^{(i)}\left(x^{(i)}\right)^T\right] + \frac{\lambda}{m}\begin{bmatrix}0 & & & \\ & 1 & & \\ & & \ddots & \\ & & & 1\end{bmatrix}$$

The parameter conventions matter here: x^(i) is a column vector, the extra vector in the gradient has a 0 in its first component, and the extra matrix in the Hessian has a 0 in its top-left corner, so the intercept θ0 is never regularized.
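In vectorized MATLAB, these two quantities and one Newton step look like the sketch below (this mirrors the formulas above; it assumes x is the m-by-n mapped feature matrix and g the sigmoid, as in the script further down):

% One regularized Newton step (sketch; x, y, theta, lambda, m, n, g as in the script)
h = g(x*theta);                               % hypothesis on all m examples
G = (lambda/m) * theta;  G(1) = 0;            % gradient regularization, theta_0 excluded
L = (lambda/m) * eye(n); L(1,1) = 0;          % Hessian regularization, theta_0 excluded
grad  = (1/m) * x' * (h - y) + G;
H     = (1/m) * x' * diag(h .* (1 - h)) * x + L;
theta = theta - H \ grad;                     % Newton update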

Results and code

Regularized linear regression results (the λ symbol in the plot legend doesn't render quite right... oh well):
[Figure: fifth-order polynomial fits to the training data for λ = 0, 1, 10]
Regularized logistic regression:
[Figures: decision boundaries on the training data for λ = 0, 1, 10]

% Regularized linear regression
clc,clear

x = load('ex5Linx.dat');
y = load('ex5Liny.dat');
% plot the data
plot(x,y,'o','MarkerFaceColor','r')

x = [ones(size(x,1),1), x, x.^2, x.^3, x.^4, x.^5];
[m,n] = size(x);

% regularization matrix: top-left entry is 0 so theta_0 is not penalized
diag_m = diag([0; ones(n-1,1)]);

% regularization parameters to try
lambda = [0 1 10]';

colortype = {'g', 'b', 'r'};

theta = zeros(n,3);
xrange = linspace(min(x(:,2)), max(x(:,2)))';
hold on
% normal equations
for lambda_i = 1:length(lambda)
    % closed-form regularized solution (backslash instead of inv for stability)
    theta(:,lambda_i) = (x'*x + lambda(lambda_i).*diag_m) \ (x'*y);
    yrange = [ones(size(xrange)) xrange xrange.^2 xrange.^3 xrange.^4 xrange.^5]*theta(:,lambda_i);
    plot(xrange, yrange, colortype{lambda_i})
    hold on
end
legend('training data', '\lambda=0', '\lambda=1', '\lambda=10')
hold off
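A quick way to see what λ does is to compare the (unregularized) training residual of each fit after running the script; larger λ fits the training points less tightly in exchange for a smoother curve. A small follow-up sketch (my own addition, not part of the exercise):

% Training sum of squared errors for each fit
for lambda_i = 1:length(lambda)
    res = x*theta(:,lambda_i) - y;
    fprintf('lambda = %g: training SSE = %.4f\n', lambda(lambda_i), res'*res)
end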

Logi.m

% Regularized logistic regression
clc,clear

x = load('ex5Logx.dat');
y = load('ex5Logy.dat');

% define sigmoid function (anonymous function; inline() is deprecated)
g = @(z) 1.0 ./ (1.0 + exp(-z));
% map the two raw features to all monomials up to degree 6 (28 features)
x = map_feature(x(:,1), x(:,2));
[m, n] = size(x);
lambda = [0 1 10];
MAX_ITR = 15;
J = zeros(MAX_ITR, 1);
% Newton's method
for lambda_i = 1:length(lambda)
    % re-initialize the fitting parameters for each lambda
    % (otherwise the previous run's theta would carry over)
    theta = zeros(n, 1);
    x_plot = load('ex5Logx.dat');
    y_plot = load('ex5Logy.dat');
    figure
    % Find the indices for the 2 classes
    pos = find(y_plot); neg = find(y_plot == 0);

    plot(x_plot(pos, 1), x_plot(pos, 2), '+')
    hold on
    plot(x_plot(neg, 1), x_plot(neg, 2), 'o','MarkerFaceColor', 'y')


    for i = 1:MAX_ITR
        z = x*theta;
        h = g(z);
        J(i) = (1/m).*sum(-y.*log(h) - (1 - y).*log(1 - h)) + (lambda(lambda_i)/(2*m))*norm(theta(2:end))^2;
        % Calculate gradient and Hessian, with regularization terms
        G = (lambda(lambda_i)/m).*theta; G(1) = 0;   % theta_0 excluded
        L = (lambda(lambda_i)/m)*eye(n); L(1,1) = 0; % theta_0 excluded
        grad = ((1/m).*x'*(h - y)) + G;
        H = ((1/m).*x'*diag(h)*diag(1 - h)*x) + L;

        % Newton update
        theta = theta - H\grad;
    end
    % Plot
    % report ||theta|| for this run (it shrinks as lambda grows)
    norm_theta = norm(theta)
    % Define the range of the grid
    u = linspace(-1, 1.5, 200);
    v = linspace(-1, 1.5, 200);
    %Initialize space for the values to be plotted
    z = zeros(length(u), length(v));
    % Evaluate z = theta*x over the grid
    for k = 1:length(u)
        for j = 1:length(v)
            z(k,j) = map_feature(u(k),v(j))*theta;
        end
    end
    z = z';
    contour(u,v,z, [0, 0], 'LineWidth',2)
    legend('y=1', 'y=0', 'Decision boundary');
    title(sprintf('\\lambda = %g', lambda(lambda_i)), 'FontSize', 14)
    hold off

end
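To confirm that Newton's method has converged, plot the cost history after a run; J should flatten out well before the 15 iterations. A minimal sketch (as written after the loop it shows the last λ only; move it inside the loop to see all three):

% Convergence check: J per Newton iteration
figure
plot(1:MAX_ITR, J, 'o--')
xlabel('Iteration'); ylabel('J(\theta)')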

map_feature.m

function out = map_feature(feat1, feat2)
% MAP_FEATURE    Feature mapping function for Exercise 5
%
%   map_feature(feat1, feat2) maps the two input features
%   to higher-order features as defined in Exercise 5.
%
%   Returns a new feature array with more features
%
%   Inputs feat1, feat2 must be the same size
%
% Note: this function is only valid for Ex 5, since the degree is
% hard-coded in.
    degree = 6;
    out = ones(size(feat1(:,1)));
    for i = 1:degree
        for j = 0:i
            out(:, end+1) = (feat1.^(i-j)).*(feat2.^j);
        end
    end
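For example, on the Exercise 5 data the mapping expands the two raw columns into 28 features:

% Usage example (assumes ex5Logx.dat is on the path)
x_raw = load('ex5Logx.dat');
x = map_feature(x_raw(:,1), x_raw(:,2));
size(x)   % m-by-28: [1, u, v, u^2, u*v, v^2, ..., u*v^5, v^6]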

References

  1. http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=DeepLearning&doc=exercises/ex5/ex5.html
  2. http://blog.csdn.net/sherlockzoom/article/details/46890817
  3. http://www.cnblogs.com/tornadomeet/archive/2013/03/17/2964515.html