UFLDL Exercise: Softmax Regression

This is the UFLDL exercise on softmax regression.

The exercise mainly involves writing two files: softmaxCost.m and softmaxPredict.m.
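
For reference, the cost function and gradient that softmaxCost.m computes (following the UFLDL notes in [1]) can be written as:

J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y^{(i)}=j\}\log\frac{e^{\theta_j^T x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^T x^{(i)}}}\right] + \frac{\lambda}{2}\sum_{i,j}\theta_{ij}^2

\nabla_{\theta_j} J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[x^{(i)}\left(1\{y^{(i)}=j\} - P(y^{(i)}=j \mid x^{(i)};\theta)\right)\right] + \lambda\theta_j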

softmaxCost.m

function [cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels)

% numClasses - the number of classes 
% inputSize - the size N of the input vector
% lambda - weight decay parameter
% data - the N x M input matrix, where each column data(:, i) corresponds to
%        a single training example
% labels - an M x 1 vector containing the labels corresponding to the input data
%

% Unroll the parameters from theta
theta = reshape(theta, numClasses, inputSize);

numCases = size(data, 2);

groundTruth = full(sparse(labels, 1:numCases, 1));

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute the cost and gradient for softmax regression.
%                You need to compute thetagrad and cost.
%                The groundTruth matrix might come in handy.
M = theta * data;                      % numClasses x numCases matrix of scores theta_j' * x^(i)
M = bsxfun(@minus, M, max(M, [], 1));  % subtract each column's max for numerical stability
M = exp(M);
M = bsxfun(@rdivide, M, sum(M));       % normalize columns: M(j,i) = P(y^(i) = j | x^(i); theta)
thetagrad = (groundTruth - M) * data' / (-numCases) + lambda * theta;
cost = groundTruth(:)' * log(M(:)) / (-numCases) + sum(theta(:).^2) * lambda / 2;
% ------------------------------------------------------------------
% Unroll the gradient matrices into a vector for minFunc
grad = thetagrad(:);
end
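
Before training, it is worth verifying the analytic gradient against a numerical one (the exercise's own computeNumericalGradient.m serves the same purpose). Below is a minimal, self-contained sketch of such a check; the problem sizes here are arbitrary assumptions. The relative difference printed at the end should be on the order of 1e-9.

% Minimal numerical gradient check for softmaxCost (sizes are illustrative).
numClasses = 4; inputSize = 8; lambda = 1e-4; m = 20;
data   = randn(inputSize, m);
labels = randi(numClasses, m, 1);
theta  = 0.005 * randn(numClasses * inputSize, 1);

[cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels);

epsilon = 1e-4;
numgrad = zeros(size(theta));
for i = 1:numel(theta)
    e = zeros(size(theta)); e(i) = epsilon;
    numgrad(i) = (softmaxCost(theta + e, numClasses, inputSize, lambda, data, labels) ...
                - softmaxCost(theta - e, numClasses, inputSize, lambda, data, labels)) ...
                / (2 * epsilon);
end
disp(norm(numgrad - grad) / norm(numgrad + grad));  % should be ~1e-9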

softmaxPredict.m

function [pred] = softmaxPredict(softmaxModel, data)

% softmaxModel - model trained using softmaxTrain
% data - the N x M input matrix, where each column data(:, i) corresponds to
%        a single test example
%
% Your code should produce the prediction vector
% pred, where pred(i) = argmax_c P(y = c | x(i)).
 
% Unroll the parameters from theta
theta = softmaxModel.optTheta;  % this provides a numClasses x inputSize matrix
pred = zeros(1, size(data, 2));

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute pred using theta assuming that the labels start 
%                from 1.
M = theta * data;                      % numClasses x numCases matrix of class scores
M = bsxfun(@minus, M, max(M, [], 1));  % stabilize before exponentiating
M = exp(M);
M = bsxfun(@rdivide, M, sum(M));       % softmax is monotonic, so the argmax is unchanged
[~, pred] = max(M, [], 1);             % predicted class = row index of each column's maximum
% ---------------------------------------------------------------------

end
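
For context, here is a sketch of how the two files are driven, modeled on the exercise's softmaxExercise.m script. It assumes the starter code's loadMNISTImages, loadMNISTLabels, and softmaxTrain are on the path, and uses the standard MNIST file names.

% Sketch of the exercise driver (assumes the UFLDL starter code is available).
inputSize  = 28 * 28;   % MNIST images are 28x28
numClasses = 10;
lambda     = 1e-4;      % weight decay parameter

images = loadMNISTImages('train-images-idx3-ubyte');
labels = loadMNISTLabels('train-labels-idx1-ubyte');
labels(labels == 0) = 10;   % the exercise remaps digit 0 to class 10

options.maxIter = 100;
softmaxModel = softmaxTrain(inputSize, numClasses, lambda, images, labels, options);

testImages = loadMNISTImages('t10k-images-idx3-ubyte');
testLabels = loadMNISTLabels('t10k-labels-idx1-ubyte');
testLabels(testLabels == 0) = 10;

pred = softmaxPredict(softmaxModel, testImages);
fprintf('Accuracy: %0.3f%%\n', 100 * mean(testLabels(:) == pred(:)));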

After 100 iterations, Accuracy: 92.640%.

References:

[1] http://deeplearning.stanford.edu/wiki/index.php/Softmax迴歸

[2] http://deeplearning.stanford.edu/wiki/index.php/Exercise:Softmax_Regression

[3] http://www.cnblogs.com/tornadomeet/archive/2013/03/23/2977621.html (parts of the implementation were revised following this article, which sped up training)
