TensorLayer Learning Diary 6 — Chapter 3, Section 3.5

Section 3.5 is mainly about the denoising autoencoder; personally I feel its main purpose is to fight overfitting... The part of this section that plots the hidden layer's weights is really interesting~~
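The core idea can be sketched in a few lines of plain numpy (illustrative only — in the real code below, the corruption is done by TensorLayer's DropoutLayer in front of the encoder): corrupt the input with a random keep-mask, then score the reconstruction against the *clean* input.

```python
import numpy as np

rng = np.random.RandomState(0)
x_clean = rng.rand(4, 784).astype(np.float32)  # a pretend MNIST batch

# Corrupt: keep each input unit with probability 0.5, zero the rest
# (inverted-dropout scaling keeps the expected activation unchanged).
keep = 0.5
mask = (rng.rand(*x_clean.shape) < keep).astype(np.float32)
x_noisy = x_clean * mask / keep

# A denoising AE feeds x_noisy through encoder/decoder, but the
# reconstruction loss is always measured against the CLEAN input:
recon = x_noisy  # stand-in for decoder(encoder(x_noisy))
mse = np.mean((recon - x_clean) ** 2)
```

Because the corrupted copy can never be memorized (the mask changes every batch), the network is pushed toward robust features rather than an identity mapping, which is why it also acts as a regularizer.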

My computer can't handle much, so as usual I scale down the training: the textbook trains with n_epoch=200, but on this old machine I'll settle for 100, and plot the hidden layer every 20 epochs with print_freq=20.

This run uses model='relu'; there should also be a model='sigmoid' variant. I'll try that next time on a better computer and compare~ For now I'll just post the relu results.

import tensorflow as tf
import tensorlayer as tl
import numpy as np

model = 'relu'

X_train, y_train, X_val, y_val, X_test, y_test = tl.files.load_mnist_dataset(shape=(-1, 784))

sess = tf.InteractiveSession()

# placeholder
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')

print("~~~~~~~~~~~~~~~Build net~~~~~~~~~~~~~~~~~~~~~~")
if model == 'relu':
    net = tl.layers.InputLayer(x, name='input')
    net = tl.layers.DropoutLayer(net, keep=0.5, name='denoising1')  # dropping some inputs makes this a denoising AE
    net = tl.layers.DenseLayer(net, n_units=196, act=tf.nn.relu, name='relu1')
    recon_layer1 = tl.layers.ReconLayer(net, x_recon=x, n_units=784, act=tf.nn.softplus, name='recon_layer1')
elif model == 'sigmoid':
    # sigmoid: set keep to 1.0 if you want a vanilla autoencoder
    net = tl.layers.InputLayer(x, name='input')
    net = tl.layers.DropoutLayer(net, keep=0.5, name='denoising1')
    net = tl.layers.DenseLayer(net, n_units=196, act=tf.nn.sigmoid, name='sigmoid1')
    recon_layer1 = tl.layers.ReconLayer(net, x_recon=x, n_units=784, act=tf.nn.sigmoid, name='recon_layer1')

## ready to train
tl.layers.initialize_global_variables(sess)

## print all params
print("~~~~~~~~~~~All net Params~~~~~~~~~~~~~~~")
net.print_params()

## pretrain
print("~~~~~~~~~~Pre-train Layer 1~~~~~~~~~~~~~~")
recon_layer1.pretrain(
    sess, x=x, X_train=X_train, X_val=X_val, denoise_name='denoising1', n_epoch=100, batch_size=128, print_freq=20,
    save=True, save_name='w1pre_'
)
# You can also disable denoising by setting denoise_name=None.

saver = tf.train.Saver()
# you may want to save the model
save_path = saver.save(sess, "./model_denoising1_3.4/")
print("Model saved in file: %s" % save_path)
sess.close()

The run output is as follows:

[TL] Load or Download MNIST > data\mnist
[TL] data\mnist\train-images-idx3-ubyte.gz
[TL] data\mnist\t10k-images-idx3-ubyte.gz
~~~~~~~~~~~~~~~Build net~~~~~~~~~~~~~~~~~~~~~~
[TL] InputLayer  input: (?, 784)
[TL] DropoutLayer denoising1: keep:0.500000 is_fix:False
[TL] DenseLayer  relu1: 196 relu
[TL] DenseLayer  recon_layer1: 784 softplus
[TL] recon_layer1 is a ReconLayer
[TL]      lambda_l2_w: 0.004000
[TL]      learning_rate: 0.000100
[TL]      use: mse, L2_w, L1_a
~~~~~~~~~~~All net Params~~~~~~~~~~~~~~~
[TL]   param   0: relu1/W:0            (784, 196)         float32_ref (mean: 0.000326054374454543, median: 0.0003588348627090454, std: 0.08798697590827942)   
[TL]   param   1: relu1/b:0            (196,)             float32_ref (mean: 0.0               , median: 0.0               , std: 0.0               )   
[TL]   num of params: 153860
~~~~~~~~~~Pre-train Layer 1~~~~~~~~~~~~~~
[TL]      [*] recon_layer1 start pretrain
[TL]      batch_size: 128
[TL]      denoising layer keep: 0.500000
[TL] Epoch 1 of 100 took 11.146820s
[TL]    train loss: 67.451898
[TL]    val loss: 65.987557
[TL] [*] w1pre_1.npz saved
[TL] Epoch 20 of 100 took 10.789619s
[TL]    train loss: 17.309017
[TL]    val loss: 17.203405
C:\Program Files\Anaconda3\lib\site-packages\matplotlib\cbook\deprecation.py:107: MatplotlibDeprecationWarning: Adding an axes using the same arguments as a previous axes currently reuses the earlier instance.  In a future version, a new instance will always be created and returned.  Meanwhile, this warning can be suppressed, and the future behavior ensured, by passing a unique label to each axes instance.
  warnings.warn(message, mplDeprecation, stacklevel=1)
[TL] [*] w1pre_20.npz saved
[TL] Epoch 40 of 100 took 10.814819s
[TL]    train loss: 11.773568
[TL]    val loss: 11.801825
[TL] [*] w1pre_40.npz saved
[TL] Epoch 60 of 100 took 12.611034s
[TL]    train loss: 10.263730
[TL]    val loss: 10.318127
[TL] [*] w1pre_60.npz saved
[TL] Epoch 80 of 100 took 10.957020s
[TL]    train loss: 9.595004
[TL]    val loss: 9.660938
[TL] [*] w1pre_80.npz saved
[TL] Epoch 100 of 100 took 11.122819s
[TL]    train loss: 9.268205
[TL]    val loss: 9.332822
[TL] [*] w1pre_100.npz saved
Model saved in file: ./model_denoising1/
[Finished in 1210.7s]

The hidden-layer weight image at the start of training.

Epoch 20 — you can already see something emerging.

Epoch 40 — a bit clearer.

Epoch 60 — it feels like the irrelevant parts are fading while the main features are growing.

Epoch 80 — the image keeps getting cleaner.

Epoch 100, the final one. I wonder what it would look like on a good machine with a few thousand epochs~ I'd really like to try.

Quite a few files get saved: the .npz files hold the numerical values of the hidden-layer weight images, and a model checkpoint is written out as well.
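If you want to re-render those weight images yourself from a saved checkpoint, here is one way to tile a (784, 196) weight matrix into a single mosaic: each hidden unit's 784 weights become a 28×28 tile on a 14×14 grid. The file-loading line is an assumption — tl.files.save_npz stores the arrays under the 'params' key in the TensorLayer versions I've seen, so adjust if yours differs.

```python
import numpy as np

def weights_to_mosaic(W, img_shape=(28, 28), grid=(14, 14)):
    """Tile each column of W (one hidden unit) into a grid of images."""
    h, w = img_shape
    rows, cols = grid
    mosaic = np.zeros((rows * h, cols * w), dtype=np.float32)
    for k in range(W.shape[1]):
        r, c = divmod(k, cols)
        tile = W[:, k].reshape(h, w)
        # normalize each tile to [0, 1] so every unit is visible
        tile = (tile - tile.min()) / (tile.max() - tile.min() + 1e-8)
        mosaic[r * h:(r + 1) * h, c * w:(c + 1) * w] = tile
    return mosaic

# W = np.load('w1pre_100.npz')['params'][0]   # assumed key name
# import matplotlib.pyplot as plt
# plt.imshow(weights_to_mosaic(W), cmap='gray'); plt.show()
```

Per-tile normalization is a deliberate choice here: without it, a few large-magnitude units would dominate the gray scale and the faint early-epoch features would be invisible.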
