How does an LSTM solve the vanishing gradient problem?

https://www.quora.com/How-does-LSTM-help-prevent-the-vanishing-and-exploding-gradient-problem-in-a-recurrent-neural-network#:~:text=The%20vanishing%20(and%20exploding)%20gradient%20problem%20is%20caused%20by%20the,controlled%20by%20the%20forget%20gate.&text=Compared%20with%20LSTM%2C%20it%20is,be%20stacked%20into%20deep%20models.

This answer explains it from an intuitive angle; I found it relatively easy to understand.

 

A key line from the answer: "the LSTM architecture is resistant to exploding and vanishing gradients, although mathematically both phenomena are still possible."
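A minimal sketch of why this holds, using the standard LSTM cell-state update (my own summary, not taken from the linked answer; the symbols $f_t$, $i_t$, $\tilde{c}_t$ for the forget gate, input gate, and candidate cell state are the usual textbook notation):

```latex
% Vanilla RNN: h_t = \tanh(W_h h_{t-1} + W_x x_t).
% Backpropagating through T steps multiplies T Jacobians of the same form:
\frac{\partial h_T}{\partial h_0}
  = \prod_{t=1}^{T} \operatorname{diag}\!\big(\tanh'(a_t)\big)\, W_h
% Since |\tanh'| \le 1 and the same W_h appears every step, this product
% tends to vanish (or explode, if the norm of W_h is large) as T grows.

% LSTM: the cell state is updated additively, gated by the forget gate:
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t
% Along the direct cell-state path (ignoring the gates' own dependence
% on the previous state), the per-step Jacobian is just the forget gate:
\frac{\partial c_t}{\partial c_{t-1}} = \operatorname{diag}(f_t)
\qquad\Longrightarrow\qquad
\frac{\partial c_T}{\partial c_0} \approx \prod_{t=1}^{T} \operatorname{diag}(f_t)
% f_t is learned per time step: if the network drives f_t \approx 1,
% the gradient passes through almost unchanged, so it need not vanish.
% But f_t < 1 (or the neglected gate-path terms) can still shrink or
% inflate it, which is why both phenomena remain mathematically possible.
```

This is also what the text fragment in the link points at: the vanishing/exploding behaviour along the cell-state path is controlled by the forget gate, rather than by repeated multiplication with a fixed weight matrix as in a vanilla RNN.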
