Paper Reading | Abstractive Sentence Summarization with Attentive Recurrent Neural Networks


  • Abstract

Abstractive Sentence Summarization generates a shorter version of a given sentence while attempting to preserve its meaning.

We introduce a conditional recurrent neural network (RNN) which generates a summary of an input sentence.

The conditioning is provided by a novel convolutional attention-based encoder which ensures that the decoder focuses on the appropriate input words at each step of generation.

Our model relies only on learned features and is easy to train in an end-to-end fashion on large data sets.

Our experiments show that the model significantly outperforms the recently proposed state-of-the-art method on the Gigaword corpus while performing competitively on the DUC-2004 shared task.
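The core idea above — a convolutional attention-based encoder that scores each input position against the previous decoder state to produce a context vector — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the window width, filter parameterization, and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumed, not from the paper)
d = 8        # embedding dimension
M = 6        # source sentence length
q = 5        # convolution window width (odd)

# Word embeddings x_1..x_M for the source sentence
X = rng.standard_normal((M, d))

# Convolutional encoder: each position j gets an "aggregate" vector z_j
# computed from a zero-padded window of embeddings centered at j.
W = rng.standard_normal((q, d, d)) * 0.1   # one d x d filter per window offset
pad = q // 2
Xp = np.vstack([np.zeros((pad, d)), X, np.zeros((pad, d))])
Z = np.stack([sum(Xp[j + k] @ W[k] for k in range(q)) for j in range(M)])

# Attention: score each z_j against the previous decoder hidden state,
# softmax over positions, then take the weighted sum of embeddings as
# the context fed to the RNN decoder at this generation step.
h = rng.standard_normal(d)                 # stand-in for decoder state h_{t-1}
scores = Z @ h
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                       # attention weights over positions
context = alpha @ X                        # context vector, shape (d,)

print(alpha.round(3), context.shape)
```

At each decoding step the weights `alpha` shift to the input words most relevant to the word being generated, which is what lets the decoder "focus on the appropriate input words" as the abstract describes.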
