UFLDL Exercise: PCA in 2D (Principal Component Analysis)

The exercise requirements can be found in the deeplearning tutorial: Exercise: PCA in 2D.

 

1. Exercise description:

This exercise applies PCA dimensionality reduction, PCA whitening, and ZCA whitening to two-dimensional data. For the underlying theory, see my earlier posts; the code is given directly below.

 

In the exercise, I computed the covariance matrix at each stage: for the raw data, after the PCA rotation, after PCA whitening, and after ZCA whitening. As expected, the covariance of the rotated data is diagonal, and after either whitening step it is approximately the identity matrix.


To compute the covariance I used MATLAB's built-in cov(x), which expects each row of the matrix x to be one data point (hence the transposes below). The result differs from the covariance matrix defined in the tutorial because MATLAB normalizes by n-1 (where n is the number of data points) rather than by n, but this has no effect on the final results.
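
As a quick sanity check of the two conventions, a minimal sketch (assuming x is the zero-mean 2 x n data matrix from the exercise; cov also subtracts the sample mean, so for non-centered data the two estimates would differ by more than this scale factor):

x = load('pcaData.txt', '-ascii');   % 2 x 45 data matrix from the exercise
n = size(x, 2);
sigma_tutorial = x * x' / n;         % tutorial convention: normalize by n
sigma_matlab = cov(x');              % MATLAB convention: normalize by n-1
disp(norm(sigma_tutorial - (n-1)/n * sigma_matlab));  % ~0 when x is zero-mean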


2. Source code

close all
 
%%================================================================
%% Step 0: Load data
%  We have provided the code to load data from pcaData.txt into x.
%  x is a 2 * 45 matrix, where the kth column x(:,k) corresponds to
%  the kth data point. Here we provide the code to load the data into x.
%  You do not need to change the code below.
 
x = load('pcaData.txt','-ascii');
figure(1);
scatter(x(1, :), x(2, :));
title('Raw data');
 
original = cov(x')  % cov normalizes by n-1; appending *(size(x,2)-1) would recover the raw scatter matrix x*x'
 
 
%%================================================================
%% Step 1a: Implement PCA to obtain U 
%  Implement PCA to obtain the rotation matrix U, which is the eigenbasis
%  of sigma.
 
% -------------------- YOUR CODE HERE -------------------- 
u = zeros(size(x, 1)); % You need to compute this
 
sigma = x*x'/size(x, 2);
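% Note: this estimate assumes zero-mean data; pcaData.txt is already
% (approximately) centered, so no mean subtraction is done here.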
 
[u,s,v] = svd(sigma);  
 
% -------------------------------------------------------- 
hold on
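% Draw the two eigenvectors (the columns of u) as segments from the origin,
% on top of the data.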
plot([0 u(1,1)], [0 u(2,1)]);
plot([0 u(1,2)], [0 u(2,2)]);
scatter(x(1, :), x(2, :));
hold off
 
%%================================================================
%% Step 1b: Compute xRot, the projection on to the eigenbasis
%  Now, compute xRot by projecting the data on to the basis defined
%  by U. Visualize the points by performing a scatter plot.
 
% -------------------- YOUR CODE HERE -------------------- 
xRot = zeros(size(x)); % You need to compute this
xRot = u'*x; 
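% Each column of xRot holds one data point's coordinates in the eigenbasis;
% its covariance should come out diagonal.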
 
rotate = cov(xRot')%*(size(x,2)-1)
 
% -------------------------------------------------------- 
 
% Visualise the rotated data. The point cloud should now be axis-aligned,
% since the covariance of xRot is diagonal.
figure(2);
scatter(xRot(1, :), xRot(2, :));
title('xRot');
 
%%================================================================
%% Step 2: Reduce the number of dimensions from 2 to 1. 
%  Compute xRot again (this time projecting to 1 dimension).
%  Then, compute xHat by projecting the xRot back onto the original axes 
%  to see the effect of dimension reduction
 
% -------------------- YOUR CODE HERE -------------------- 
k = 1; % Use k = 1 and project the data onto the first eigenbasis
xHat = zeros(size(x)); % You need to compute this
 
xTiled = zeros(size(x));
xTiled(1:k,:) = xRot(1:k,:); 
xHat = u*xTiled;
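% xTiled keeps only the first k rotated components (the rest are zeroed);
% rotating back with u gives the reconstruction xHat in the original coordinates.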
 
 
 
% -------------------------------------------------------- 
figure(3);
scatter(xHat(1, :), xHat(2, :));
title('xHat');
 
 
%%================================================================
%% Step 3: PCA Whitening
%  Compute xPCAWhite and plot the results.
 
epsilon = 1e-5;
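% epsilon regularizes the whitening: it keeps the scaling
% 1./sqrt(eigenvalue + epsilon) bounded when an eigenvalue is near zero.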
% -------------------- YOUR CODE HERE -------------------- 
xPCAWhite = zeros(size(x)); % You need to compute this
xPCAWhite = diag(1./sqrt(diag(s) + epsilon)) * xRot;
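% Each rotated component is rescaled by 1/sqrt(lambda_i + epsilon),
% so cov(xPCAWhite') is approximately the identity.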
 
PCAWhite = cov(xPCAWhite')%*(size(x,2)-1)
 
% -------------------------------------------------------- 
figure(4);
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');
 
%%================================================================
%% Step 4: ZCA Whitening
%  Compute xZCAWhite and plot the results.
 
% -------------------- YOUR CODE HERE -------------------- 
xZCAWhite = zeros(size(x)); % You need to compute this
xZCAWhite = u * xPCAWhite; 
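% Rotating the PCA-whitened data back with u gives ZCA whitening; of all
% whitening transforms, it keeps the result closest to the original data.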
 
ZCAWhite = cov(xZCAWhite')%*(size(x,2)-1)
 
% -------------------------------------------------------- 
figure(5);
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');
 
%% Congratulations! When you have reached this point, you are done!
%  You can now move onto the next PCA exercise. :)


