Abstractive Text Summarization on Templatized Data

Publication Type : Journal Article

Publisher : Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering

Campus : Bengaluru

School : School of Computing

Year : 2021

Abstract : Abstractive text summarization generates a brief form of an input text without reusing sentences from the original source, while still preserving the meaning and the important information. This can be modelled as a sequence-to-sequence learning problem using Recurrent Neural Networks (RNNs). Typical RNN models are difficult to train owing to the vanishing and exploding gradient problems. Long Short-Term Memory (LSTM) networks are an answer to the vanishing gradient problem, and LSTM-based modelling with an attention mechanism is vital for improving text summarization. The key intuition behind the attention mechanism is to decide how much attention to pay to each word in the input sequence when generating a word at a particular step. The objective of this paper is to study various attention models and their applicability to text summarization. The intent is to implement abstractive text summarization with an appropriate attention model using LSTM on a templatized dataset, avoiding noise and ambiguity so as to generate a high-quality summary.
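The paper itself does not include code; as a rough illustration of the attention intuition described in the abstract, the following is a minimal NumPy sketch of additive (Bahdanau-style) attention, which is one common choice for LSTM encoder-decoder summarizers. All names, weight matrices, and dimensions here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def additive_attention(decoder_state, encoder_states, W1, W2, v):
    # Score each encoder hidden state h_i against the current decoder
    # state s_t: score_i = v^T tanh(W1 h_i + W2 s_t).
    scores = np.array([
        v @ np.tanh(W1 @ h + W2 @ decoder_state)
        for h in encoder_states
    ])
    # Normalized scores say how much attention to pay to each input word.
    weights = softmax(scores)
    # Context vector: attention-weighted sum of the encoder states,
    # fed to the decoder when it generates the next summary word.
    context = (weights[:, None] * encoder_states).sum(axis=0)
    return context, weights

# Toy dimensions: 4 source tokens, hidden size 8, attention size 6.
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 8))   # encoder hidden states h_1..h_4
dec = rng.normal(size=8)        # decoder state s_t at the current step
W1 = rng.normal(size=(6, 8))
W2 = rng.normal(size=(6, 8))
v = rng.normal(size=6)

context, weights = additive_attention(dec, enc, W1, W2, v)
print("attention weights:", np.round(weights, 3))  # sums to 1
```

In a full model these weights are recomputed at every decoding step, so each generated word can attend to a different part of the input sequence.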

Cite this Research Publication : Jyothi, C. and M. Supriya. 2021. Abstractive Text Summarization on Templatized Data. Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering (LNICST), Vol. 383. doi:10.1007/978-3-030-79276-3_17.