Experimenting with MNIST data in Keras

# -*- coding: utf-8 -*-
"""
Created on Mon Oct 30 19:44:02 2017

@author: user
"""

from __future__ import print_function
# Import numpy, a widely used scientific computing library with optimized matrix operations
import numpy as np
np.random.seed(1337)  # fix the random seed for reproducibility


# MNIST is a commonly used handwritten digit dataset; it is loaded further below
# Import the Sequential model
from keras.models import Sequential
# Import the fully connected Dense layer, the Activation layer, and the Dropout layer
from keras.layers.core import Dense, Dropout, Activation




# Batch size
batch_size = 100
# Number of classes
nb_classes = 10
# Number of training epochs
nb_epoch = 20

'''
The following block loads the MNIST data. Most Keras examples online load it with
(X_train, y_train), (X_test, y_test) = mnist.load_data()
but that line kept failing for me with: OSError: [Errno 22] Invalid argument,
so the TensorFlow reader is used instead.
'''
from tensorflow.examples.tutorials.mnist import input_data  
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)  

X_train, Y_train = mnist.train.images,mnist.train.labels  
X_test, Y_test = mnist.test.images, mnist.test.labels  
X_train = X_train.reshape(-1, 28, 28,1).astype('float32')  
X_test = X_test.reshape(-1,28, 28,1).astype('float32')  
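# For reference, the standard Keras loading route (the one that raised the OSError above)
# would look roughly like the commented-out sketch below; it is not part of the run whose
# output appears at the end, and it assumes keras.datasets.mnist can download the data:
# from keras.datasets import mnist
# from keras.utils import np_utils
# (X_train, Y_train), (X_test, Y_test) = mnist.load_data()    # 60000/10000 images of 28x28 uint8
# X_train = X_train.reshape(-1, 784).astype('float32') / 255  # flatten and scale to [0, 1]
# X_test = X_test.reshape(-1, 784).astype('float32') / 255
# Y_train = np_utils.to_categorical(Y_train, nb_classes)      # one-hot encode the labels
# Y_test = np_utils.to_categorical(Y_test, nb_classes)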

# Print the shapes of the training and test data
print(X_train.shape,X_test.shape,Y_train.shape,Y_test.shape)

# Reshape: flatten each 28x28x1 image into a 784-dimensional vector
X_train = X_train.reshape(55000,784)
X_test = X_test.reshape(10000,784)
print(X_train.shape,X_test.shape,Y_train.shape,Y_test.shape)



# The TensorFlow MNIST reader splits the data into 55,000 training samples and 10,000 test samples

# X_train was a 55000 x 28 x 28 x 1 tensor; it has been reshaped into a 55000 x 784 matrix

# X_test was a 10000 x 28 x 28 x 1 tensor; it has been reshaped into a 10000 x 784 matrix

# Convert X_train and X_test to float32
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
# Normalize; note that input_data.read_data_sets already scales pixel values to [0, 1],
# so this extra division shrinks the values further, but the network still trains (see the log below)
X_train /= 255
X_test /= 255
# Print the number of training and test samples
print(X_train.shape[0], 'train samples')
print(X_test.shape[0], 'test samples')

# Not needed here: the labels are already one-hot encoded because one_hot=True was passed to read_data_sets
#Y_train = np_utils.to_categorical(Y_train, nb_classes)
#Y_test = np_utils.to_categorical(Y_test, nb_classes)

# Build a Sequential model
model = Sequential()
'''
The model needs to know the shape of its input data,
so the first layer of a Sequential model must be given the input shape.
The following layers can infer their input shapes automatically,
so this parameter does not need to be specified for them.
'''
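# An equivalent way to give the first layer its input shape is input_dim (standard Keras API),
# e.g. model.add(Dense(500, input_dim=784)) -- the script below uses input_shape=(784,) instead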

# The input layer has 784 neurons
# The first hidden layer has 500 neurons, ReLU activation, and a dropout rate of 0.2
model.add(Dense(500, input_shape=(784,)))
model.add(Activation('relu'))
model.add(Dropout(0.2))

# The second hidden layer has 500 neurons, ReLU activation, and a dropout rate of 0.2
model.add(Dense(500))
model.add(Activation('relu'))
model.add(Dropout(0.2))

# The output layer has 10 neurons with softmax activation, producing the class probabilities
model.add(Dense(10))
model.add(Activation('softmax'))

# Print a summary of the model
# Total number of parameters: 784*500+500 + 500*500+500 + 500*10+10 = 648,010
model.summary()

model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

# Train; the batch size (200) and epoch count (20) are hard-coded here rather than taken from batch_size / nb_epoch above
history = model.fit(X_train, Y_train,
                    batch_size = 200,
                    epochs = 20,
                    verbose = 1,
                    validation_data = (X_test, Y_test))

score = model.evaluate(X_test, Y_test, verbose=0)


# Report the trained model's performance on the test set
print('Test score:', score[0])
print('Test accuracy:', score[1])

Output:

Using TensorFlow backend.
Extracting MNIST_data/train-images-idx3-ubyte.gz
Extracting MNIST_data/train-labels-idx1-ubyte.gz
Extracting MNIST_data/t10k-images-idx3-ubyte.gz
Extracting MNIST_data/t10k-labels-idx1-ubyte.gz
(55000, 28, 28, 1) (10000, 28, 28, 1) (55000, 10) (10000, 10)
(55000, 784) (10000, 784) (55000, 10) (10000, 10)
55000 train samples
10000 test samples
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_28 (Dense)             (None, 500)               392500    
_________________________________________________________________
activation_28 (Activation)   (None, 500)               0         
_________________________________________________________________
dropout_19 (Dropout)         (None, 500)               0         
_________________________________________________________________
dense_29 (Dense)             (None, 500)               250500    
_________________________________________________________________
activation_29 (Activation)   (None, 500)               0         
_________________________________________________________________
dropout_20 (Dropout)         (None, 500)               0         
_________________________________________________________________
dense_30 (Dense)             (None, 10)                5010      
_________________________________________________________________
activation_30 (Activation)   (None, 10)                0         
=================================================================
Total params: 648,010.0
Trainable params: 648,010.0
Non-trainable params: 0.0
_________________________________________________________________
Train on 55000 samples, validate on 10000 samples
Epoch 1/20
55000/55000 [==============================] - 8s - loss: 0.9839 - acc: 0.7180 - val_loss: 0.4231 - val_acc: 0.8754
Epoch 2/20
55000/55000 [==============================] - 6s - loss: 0.3968 - acc: 0.8811 - val_loss: 0.3209 - val_acc: 0.9050
Epoch 3/20
55000/55000 [==============================] - 7s - loss: 0.3220 - acc: 0.9049 - val_loss: 0.2795 - val_acc: 0.9187
Epoch 4/20
55000/55000 [==============================] - 7s - loss: 0.2745 - acc: 0.9183 - val_loss: 0.2363 - val_acc: 0.9280
Epoch 5/20
55000/55000 [==============================] - 7s - loss: 0.2335 - acc: 0.9303 - val_loss: 0.1984 - val_acc: 0.9410
Epoch 6/20
55000/55000 [==============================] - 7s - loss: 0.2029 - acc: 0.9387 - val_loss: 0.1727 - val_acc: 0.9481
Epoch 7/20
55000/55000 [==============================] - 7s - loss: 0.1781 - acc: 0.9463 - val_loss: 0.1533 - val_acc: 0.9548
Epoch 8/20
55000/55000 [==============================] - 8s - loss: 0.1585 - acc: 0.9529 - val_loss: 0.1375 - val_acc: 0.9585
Epoch 9/20
55000/55000 [==============================] - 7s - loss: 0.1419 - acc: 0.9569 - val_loss: 0.1310 - val_acc: 0.9599
Epoch 10/20
55000/55000 [==============================] - 7s - loss: 0.1278 - acc: 0.9618 - val_loss: 0.1194 - val_acc: 0.9640
Epoch 11/20
55000/55000 [==============================] - 7s - loss: 0.1149 - acc: 0.9658 - val_loss: 0.1100 - val_acc: 0.9663
Epoch 12/20
55000/55000 [==============================] - 8s - loss: 0.1067 - acc: 0.9681 - val_loss: 0.1026 - val_acc: 0.9677
Epoch 13/20
55000/55000 [==============================] - 7s - loss: 0.0987 - acc: 0.9704 - val_loss: 0.0985 - val_acc: 0.9695
Epoch 14/20
55000/55000 [==============================] - 7s - loss: 0.0905 - acc: 0.9719 - val_loss: 0.0942 - val_acc: 0.9715
Epoch 15/20
55000/55000 [==============================] - 7s - loss: 0.0832 - acc: 0.9750 - val_loss: 0.0916 - val_acc: 0.9701
Epoch 16/20
55000/55000 [==============================] - 7s - loss: 0.0786 - acc: 0.9761 - val_loss: 0.0878 - val_acc: 0.9726
Epoch 17/20
55000/55000 [==============================] - 7s - loss: 0.0720 - acc: 0.9781 - val_loss: 0.0862 - val_acc: 0.9715
Epoch 18/20
55000/55000 [==============================] - 7s - loss: 0.0685 - acc: 0.9787 - val_loss: 0.0852 - val_acc: 0.9733
Epoch 19/20
55000/55000 [==============================] - 7s - loss: 0.0625 - acc: 0.9809 - val_loss: 0.0773 - val_acc: 0.9762
Epoch 20/20
55000/55000 [==============================] - 7s - loss: 0.0596 - acc: 0.9812 - val_loss: 0.0761 - val_acc: 0.9771
Test score: 0.0761470827273
Test accuracy: 0.9771
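
To inspect individual predictions from the trained model, a minimal sketch (not part of the run above; it assumes the lines are appended to the end of the script) would be:

y_prob = model.predict(X_test[:5])        # class probabilities for the first 5 test images
y_pred = np.argmax(y_prob, axis=1)        # predicted digit for each image
y_true = np.argmax(Y_test[:5], axis=1)    # true digits (the labels are one-hot encoded)
print('predicted:', y_pred, 'actual:', y_true)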