A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents
Abstract
Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.