Ordering of these layers:
python - Ordering of batch normalization and dropout in TensorFlow? - Stack Overflow https://stackoverflow.com/questions/39691902/ordering-of-batch-normalization-and-dropout-in-tensorflow
(1) conv/fc -> relu -> dropout -> batch norm -> conv/fc
(2) conv/fc -> batch norm -> relu -> dropout -> conv/fc
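Ordering (2) can be sketched as a toy forward pass. This is a minimal NumPy illustration, not code from the linked answer; the layer sizes (a 32x8 input batch, 16 hidden units) and the inverted-dropout scaling are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(x, eps=1e-5):
    # per-feature normalization over the batch (learnable gamma/beta omitted for brevity)
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def relu(x):
    return np.maximum(x, 0.0)

def dropout(x, rate):
    # inverted dropout: survivors are scaled by 1/(1-rate) at train time
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

# ordering (2): fc -> batch norm -> relu -> dropout -> fc
x  = rng.standard_normal((32, 8))    # toy batch: 32 samples, 8 features
w1 = rng.standard_normal((8, 16))
w2 = rng.standard_normal((16, 4))

z1  = x @ w1            # fc
bn  = batch_norm(z1)    # batch norm
a   = relu(bn)          # relu
d   = dropout(a, 0.5)   # dropout
out = d @ w2            # next fc
print(out.shape)        # (32, 4)
```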
Dropout means that during training, some hidden-layer nodes are randomly deactivated. Those nodes can temporarily be treated as not part of the network, but their weights are kept (they are just not updated for that pass), because they may become active again when the next sample is fed in.
Reference: Deep learning: Part 41 (A simple take on Dropout) - tornadomeet - cnblogs https://www.cnblogs.com/tornadomeet/p/3258122.html
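The "weights are kept, only the node's output is zeroed" behaviour can be sketched in a few lines. This sketch assumes inverted dropout (the scaling variant used by modern frameworks; the post above doesn't specify the scaling scheme): dropped units output 0, and survivors are scaled by 1/(1-rate) so the expected activation is unchanged, which is why inference can simply use the full network:

```python
import numpy as np

rng = np.random.default_rng(1)

def dropout(x, rate, training=True):
    if not training:
        return x          # inference: full network, no rescaling needed
    # inverted dropout: zero out a fraction `rate`, scale survivors up
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

x = np.ones((10000, 1))
y = dropout(x, rate=0.5)
print(y.mean())           # close to 1.0: expected activation preserved
print((y == 0).mean())    # close to 0.5: about half the units dropped
```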