3.7 Concise Implementation of Softmax Regression

Code Implementation

import tensorflow as tf
from tensorflow import keras

fashion_mnist = keras.datasets.fashion_mnist
(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()

# Normalize pixel values to [0, 1] to make training easier
x_train = x_train / 255.0
x_test = x_test / 255.0

model = keras.Sequential([ # define a Sequential instance
    keras.layers.Flatten(input_shape=(28, 28)), # Flatten layer: reshapes each 28 * 28 image into a vector of shape (784,); it has no trainable parameters
    keras.layers.Dense(10, activation=tf.nn.softmax) # fully connected layer; for this multi-class problem the activation is softmax
])
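As the comment above notes, Flatten only reshapes its input rather than acting as a fully connected layer. A minimal sketch (assuming TensorFlow 2 is installed; the batch size of 2 is illustrative) confirms it produces a (batch, 784) tensor and holds no trainable weights:

```python
import numpy as np
from tensorflow import keras

# Flatten only reshapes: each 28 x 28 image becomes a 784-vector
layer = keras.layers.Flatten()
x = np.zeros((2, 28, 28), dtype='float32')
out = layer(x)

# out has shape (2, 784), and the layer holds no trainable weights
print(tuple(out.shape))            # (2, 784)
print(len(layer.trainable_weights)) # 0
```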

# Computing the softmax operation and the cross-entropy loss separately can cause
# numerical instability, so the Keras API in TensorFlow 2 provides a loss argument
# that combines the two
loss = 'sparse_categorical_crossentropy'

# Define the optimization algorithm with a learning rate of 0.1
optimizer = tf.keras.optimizers.SGD(0.1)

# Compile the model
model.compile(optimizer=optimizer,
              loss=loss,
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=5, batch_size=256)

test_loss, test_acc = model.evaluate(x_test, y_test)
print('Test Acc:', test_acc)
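A numerically safer variant of the model above is to output raw logits (no activation on the Dense layer) and let the loss apply the softmax internally via `from_logits=True`. The following is a minimal sketch, trained on a small synthetic batch purely to show the pipeline runs end to end (the data is random, not Fashion-MNIST):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Same architecture, but the Dense layer outputs raw logits;
# the loss applies softmax internally, which is numerically more stable
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(10)  # logits: no softmax here
])

model.compile(optimizer=tf.keras.optimizers.SGD(0.1),
              loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

# A tiny synthetic batch just to demonstrate fit/predict
x = np.random.rand(32, 28, 28).astype('float32')
y = np.random.randint(0, 10, size=(32,))
model.fit(x, y, epochs=1, verbose=0)

# To recover probabilities at prediction time, apply softmax to the logits
probs = tf.nn.softmax(model.predict(x, verbose=0), axis=-1)
```

Because the softmax and the log in the cross-entropy are fused inside the loss, large logits do not overflow the way an explicit `exp` in the forward pass can.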

Output

Train on 60000 samples
Epoch 1/5
60000/60000 [==============================] - 1s 10us/sample - loss: 0.7937 - accuracy: 0.7428
Epoch 2/5
60000/60000 [==============================] - 0s 8us/sample - loss: 0.5731 - accuracy: 0.8110
Epoch 3/5
60000/60000 [==============================] - 0s 7us/sample - loss: 0.5273 - accuracy: 0.8245
Epoch 4/5
60000/60000 [==============================] - 0s 7us/sample - loss: 0.5035 - accuracy: 0.8309
Epoch 5/5
60000/60000 [==============================] - 0s 7us/sample - loss: 0.4871 - accuracy: 0.8359
10000/10000 [==============================] - 0s 24us/sample - loss: 0.5126 - accuracy: 0.8241
Test Acc: 0.8241
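For reference, the reported accuracy for sparse integer labels is simply the fraction of samples whose argmax over the predicted class probabilities matches the label. A tiny hand-computed sketch (the probabilities and labels are made up for illustration):

```python
import numpy as np

# Three samples, three classes: predicted class probabilities per sample
probs = np.array([[0.1, 0.7, 0.2],
                  [0.8, 0.1, 0.1],
                  [0.3, 0.3, 0.4]])
labels = np.array([1, 0, 1])  # integer (sparse) labels

# argmax picks the predicted class; accuracy is the match rate
pred = probs.argmax(axis=1)        # [1, 0, 2]
acc = (pred == labels).mean()      # 2 of 3 correct
```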