
Summarizing News: Unleashing the Power of BART

Publication Type : Conference Paper

Publisher : IEEE

Url : https://doi.org/10.1109/CONIT61985.2024.10626617

Keywords : Measurement;Analytical models;Accuracy;Law;Text summarization;Motion pictures;BART;GPT-2;NLP;T5;PEGASUS;ROUGE;SacreBLEU

Campus : Bengaluru

School : School of Computing

Department : Computer Science and Engineering

Year : 2024

Abstract : In today’s world, with so much information available online, it is critical to have a method for extracting that information efficiently and rapidly. Text summarization is an important NLP task that aims to distill lengthy material into succinct summaries while preserving key information: the exercise of extracting the essential content from a document, or a group of related documents, and condensing it into a shorter version that retains the overall meaning. In practice, text summarization has a wide range of applications across domains, enabling users to efficiently process and extract insights from large volumes of textual data; examples include tools such as QuillBot, movie platforms that provide a gist of a film, news aggregation, and legal document analysis. This study investigates the performance of summarization models through a comparative analysis of four models, GPT-2, T5, BART, and PEGASUS, for abstract generation on the widely used CNN/DailyMail corpus. System performance is assessed using the SacreBLEU and ROUGE metrics. The study also analyzes the pros and cons of each model and provides insights into their effectiveness in generating accurate and informative summaries. All things considered, this work advances the area of text summarization by offering a thorough examination of state-of-the-art methods on a challenging dataset.
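To illustrate the kind of evaluation the abstract describes, the sketch below computes ROUGE-1 F1, the simplest member of the ROUGE family, as unigram overlap between a candidate summary and a reference summary. This is a minimal illustrative implementation, not the paper's code; the study presumably uses standard ROUGE and SacreBLEU library implementations, and `rouge1_f1` is a hypothetical helper name.

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall."""
    # Unigram counts for the candidate and reference summaries
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Clipped overlap: each shared unigram counts at most min(cand, ref) times
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Example: a short candidate against a longer reference
score = rouge1_f1("the cat sat", "the cat sat on the mat")
```

Full ROUGE evaluation also reports ROUGE-2 (bigram overlap) and ROUGE-L (longest common subsequence), which reward preserved phrasing and word order rather than bag-of-words overlap alone.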

Cite this Research Publication : M Asmitha, C.R. Kavitha, D Radha, Summarizing News: Unleashing the Power of BART, [source], IEEE, 2024, https://doi.org/10.1109/CONIT61985.2024.10626617
