Papers

CONFERENCE (INTERNATIONAL)

Incorporating Topic Sentence on Neural News Headline Generation

Jan Wira Gotama Putra(Tokyo Tech), Hayato Kobayashi, Nobuyuki Shimizu

19th International Conference on Computational Linguistics and Intelligent Text Processing(CICLing 2018), 2018/3

Category:

Natural Language Processing, Machine Learning

Abstract:
Most past studies on neural news headline generation train the encoder-decoder model using the first sentence of a document aligned with a headline. However, it is found that the first sentence might not provide sufficient information. This study proposes to use a topic sentence as the input instead of the first sentence for neural news headline generation task. The topic sentence is defined as the most newsworthy sentence and has been studied in the past. Experimental result shows that the model trained on the topic sentence has a better generalization than the model trained using the first sentence. Training the model using both the first and topic sentences increases the performance even further compared to only training using the topic sentence in a certain case. We conclude that using the topic sentence is a strategy of giving a more informative information into the neural network compared to using the first sentence, while keeping the input length as short as possible at the same time.