Machine Learning: A Simple Application of a Three-Layer Neural Network in MATLAB



1 Objective

Classify the prime numbers within 1-100.

2 Design

1. Generate the numbers 1-100 and their binary representations.

2. Label the primes with 1 and all other numbers with 0.

3. Use the first 60 samples for training and the remaining 40 for testing.

4. Use a three-layer neural network, with sigmoid activations in the hidden and output layers.
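The data-preparation steps above (encode, label, split) can be sketched in Python; the helper names here are my own, not from the original post.

```python
# Sketch of steps 1-3: encode 1-100 as 7-bit vectors, label primes
# with 1 and everything else with 0, then split 60/40.

def is_prime(n):
    # Trial division is sufficient for n <= 100.
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# Each row: [b6, b5, ..., b0, label] for n = 1..100.
dataset = []
for n in range(1, 101):
    bits = [int(b) for b in format(n, '07b')]  # 7-bit binary encoding
    dataset.append(bits + [int(is_prime(n))])

train, test = dataset[:60], dataset[60:]
print(len(train), len(test), sum(row[-1] for row in dataset))  # 60 40 25
```

There are 25 primes below 100, so the label column sums to 25.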

3 Code

1. Dataset generation function

function f = dataset_generator
%Build the dataset: 7-bit binary encodings of 1-100 plus a prime label

bits_num = 7;
prime_table = [2,3,5,7,11,13,17,19,23,29,31,37,41,43,47,53,59,61,67,71,73,79,83,89,97];
prime_num = 25;
prime_dataset = zeros(100,bits_num+1);

%Generate prime number dataset 1-100
%Columns 1-7: binary digits; column 8: label (1 = prime, 0 = not prime)
for count = 1:100
    bin_str = dec2bin(count,bits_num);
    for i = 1:bits_num
        prime_dataset(count,i) = str2double(bin_str(i));
    end
    for i = 1:prime_num
        if(count == prime_table(i))
            prime_dataset(count,bits_num+1) = 1;
        end
    end
end

f = prime_dataset;

2. Learning-rate selection script. With 8 hidden neurons the best alpha came out as 1, and with 15 hidden neurons as 0.1; why is that?
function test_script1

%Training set: first 60 numbers; testing set: remaining 40
data = dataset_generator;
x_train = data(1:60,1:7);
y_train = data(1:60,8);
x_test = data(61:100,1:7);
y_test = data(61:100,8);

%Sweep the learning rate over 0.01, 0.1, 1, 10, 100
for pow_num = 1:5

    %Learning rate
    alpha = 10^(-3+pow_num);

    %Initialize the network with weights in [-1,1]
    syn0 = 2*rand(7,15)-1;
    syn1 = 2*rand(15,1)-1;

    %Train the network with batch gradient descent
    for i = 1:60000
        l0 = x_train;
        l1 = sigmoid(l0*syn0);
        l2 = sigmoid(l1*syn1);
        l2_error = l2 - y_train;
        %Record the mean absolute error every 10000 iterations
        if(i==1)
            overallerror(1) = mean(abs(l2_error));
        end
        if(mod(i,10000)==0)
            overallerror(i/10000+1) = mean(abs(l2_error));
        end
        %Backpropagate and update the weights
        l2_delta = l2_error.*sigmoid_derivation(l2);
        l1_error = l2_delta*syn1';
        l1_delta = l1_error.*sigmoid_derivation(l1);
        syn1 = syn1 - alpha*(l1'*l2_delta);
        syn0 = syn0 - alpha*(l0'*l1_delta);
    end
    %Display this alpha and its recorded error curve
    alpha
    overallerror

end

%Testing progress
%testing_output = sigmoid(sigmoid(x_test*syn0)*syn1)
%testing_error = sum(abs(y_test - testing_output))

function s = sigmoid(x)
%Elementwise logistic function
s = 1./(1+exp(-x));

function s = sigmoid_derivation(x)
%Derivative of the sigmoid expressed via its output: if y = sigmoid(z),
%then dy/dz = y.*(1-y), so x here is the activation, not the pre-activation
s = x.*(1-x);
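The learning-rate sweep above can be mirrored in NumPy. This is a sketch, not the post's code: the alphas, the random seed, and the reduced iteration count (5000 instead of 60000) are my own choices for brevity.

```python
import numpy as np

def is_prime(n):
    # Trial division, sufficient for n <= 100.
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Rebuild the same dataset: 7-bit encodings of 1..100, prime label last.
X = np.array([[int(b) for b in format(n, '07b')] for n in range(1, 101)], dtype=float)
y = np.array([[float(is_prime(n))] for n in range(1, 101)])
x_train, y_train = X[:60], y[:60]

rng = np.random.default_rng(0)
final_err = {}
for alpha in (0.01, 0.1, 1.0):           # reduced sweep for brevity
    syn0 = 2 * rng.random((7, 15)) - 1   # input -> hidden weights in [-1,1]
    syn1 = 2 * rng.random((15, 1)) - 1   # hidden -> output weights in [-1,1]
    for _ in range(5000):                # fewer iterations than the original 60000
        l1 = sigmoid(x_train @ syn0)
        l2 = sigmoid(l1 @ syn1)
        l2_error = l2 - y_train
        l2_delta = l2_error * l2 * (1 - l2)            # sigmoid'(z) = y*(1-y)
        l1_delta = (l2_delta @ syn1.T) * l1 * (1 - l1)
        syn1 -= alpha * (l1.T @ l2_delta)
        syn0 -= alpha * (x_train.T @ l1_delta)
    final_err[alpha] = float(np.mean(np.abs(l2_error)))

print(final_err)
```

Comparing the final mean absolute errors across alphas reproduces the kind of sweep the MATLAB script performs.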

3. Main script: data generation, train/test split, network training, network testing, and comparison of results
function test_script

%Training Set
data = dataset_generator;
x_train = data(1:60,1:7);
y_train = data(1:60,8);
x_test = data(61:100,1:7);
y_test = data(61:100,8);

%According to the results of "test_script1.m"
%Learning rate
%"alpha = 1" --------- "number of hidden neurons = 8"
%"alpha = 0.1" --------- "number of hidden neurons = 15"
alpha = 0.1;

%Initialize the network with weights in [-1,1]
syn0 = 2*rand(7,15)-1;
syn1 = 2*rand(15,1)-1;

%Train the network with batch gradient descent
for i = 1:60000
    l0 = x_train;
    l1 = sigmoid(l0*syn0);
    l2 = sigmoid(l1*syn1);
    l2_error = l2 - y_train;
    if(i==1)
        overallerror(1) = mean(abs(l2_error));
    end
    if(mod(i,10000)==0)
        overallerror(i/10000+1) = mean(abs(l2_error));
    end
    l2_delta = l2_error.*sigmoid_derivation(l2);
    l1_error = l2_delta*syn1';
    l1_delta = l1_error.*sigmoid_derivation(l1);
    syn1 = syn1 - alpha*(l1'*l2_delta);
    syn0 = syn0 - alpha*(l0'*l1_delta);
end
overallerror

%Testing: threshold the network output at 0.5
testing_output = sigmoid(sigmoid(x_test*syn0)*syn1);
testing_output = round(testing_output);
testing_error = sum(abs(y_test - testing_output))
%Append each test number in a second column for readability
for cnt = 61:100
    testing_output(cnt-60,2) = cnt;
end
testing_output

function s = sigmoid(x)
%Elementwise logistic function
s = 1./(1+exp(-x));

function s = sigmoid_derivation(x)
%Derivative of the sigmoid expressed via its output: x is the activation
s = x.*(1-x);
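A subtle point in `sigmoid_derivation` is that its input is the sigmoid *output*, not the pre-activation. A quick numerical check (my own, not from the post) confirms the identity it relies on, sigma'(z) = sigma(z)*(1 - sigma(z)):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

z = 0.7
h = 1e-6
# Central-difference estimate of the derivative at z
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)
# Derivative computed from the activation alone, as sigmoid_derivation does
analytic = sigmoid(z) * (1 - sigmoid(z))
print(abs(numeric - analytic) < 1e-9)  # True
```

This is why backpropagation here never needs to store the pre-activations: the cached layer outputs `l1` and `l2` are enough.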

4 Output

Result analysis and future work: unlike the earlier, simpler even/odd classification, the error rate here is very large. Is this caused by the nonlinear nature of the primes? And how can the network's results be connected to the underlying mathematics?
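To judge how large the error really is, it helps to compute a trivial baseline (this sketch is mine, not from the post): a model that always predicts "not prime" on the 61-100 test split.

```python
def is_prime(n):
    # Trial division, sufficient for n <= 100.
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# Primes in the test split 61..100 that the network must find
test_primes = [n for n in range(61, 101) if is_prime(n)]
print(test_primes)            # [61, 67, 71, 73, 79, 83, 89, 97]
print(len(test_primes) / 40)  # 0.2 = error rate of always predicting 0
```

Only if the network misclassifies fewer than 8 of the 40 test numbers does it beat this constant baseline.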
