Paper Reading | Abstractive Sentence Summarization with Attentive Recurrent Neural Networks

  • Abstract

Abstractive Sentence Summarization generates a shorter version of a given sentence while attempting to preserve its meaning.

We introduce a conditional recurrent neural network (RNN) which generates a summary of an input sentence.

The conditioning is provided by a novel convolutional attention-based encoder, which ensures that the decoder focuses on the appropriate input words at each step of generation.
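The convolutional attention idea can be illustrated with a minimal NumPy sketch: a convolution over a window of word embeddings produces an aggregate vector per input position, and the decoder's previous hidden state scores these aggregates to form attention weights. This is a toy illustration under assumed shapes; the function name, the single-filter convolution, and the dot-product scoring are simplifications, not the paper's exact parameterization.

```python
import numpy as np

def conv_attention_context(word_embs, dec_state, kernel, q=2):
    """Toy convolutional attention-based encoder step.

    word_embs: (T, d) embeddings of the T input words
    dec_state: (d,)   previous decoder hidden state
    kernel:    (2*q+1, d) convolution filter over a window of 2*q+1 words
    Returns the attention-weighted context vector of shape (d,).
    """
    T, d = word_embs.shape
    # Pad the sequence so every position has a full window.
    padded = np.pad(word_embs, ((q, q), (0, 0)))
    # Convolutional aggregate vector z_i for each input position i.
    z = np.stack([(padded[i:i + 2 * q + 1] * kernel).sum(axis=0)
                  for i in range(T)])
    # Alignment scores against the decoder state, softmax-normalized
    # (shifted by the max for numerical stability).
    scores = z @ dec_state
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    # Context vector: attention-weighted sum of the word embeddings.
    return alpha @ word_embs
```

At each decoding step the resulting context vector is fed to the RNN decoder, so positions whose local window matches the decoder state receive more weight.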

Our model relies only on learned features and is easy to train in an end-to-end fashion on large data sets.

Our experiments show that the model significantly outperforms the recently proposed state-of-the-art method on the Gigaword corpus while performing competitively on the DUC-2004 shared task.
