Abstractive Summarization of News Articles Using BART

Authors

  • Mohan Bikram KC, Department of Computer Engineering, Gandaki College of Engineering and Science, Nepal
  • Ayush Sharma Kaundinya, Department of Computer Engineering, Gandaki College of Engineering and Science, Nepal
  • Pramish Adhikari, Department of Computer Engineering, Gandaki College of Engineering and Science, Nepal
  • Smita Adhikari, Department of Electronics and Computer Engineering, Pashchimanchal Campus, Pokhara, Nepal

DOI:

https://doi.org/10.3126/pecj.v3i1.93538

Keywords:

Abstractive Text Summarization, BART-Large, News Articles, Natural Language Processing

Abstract

Abstractive text summarization generates concise summaries that preserve key concepts through newly constructed sentences rather than extraction from the source. As online news content grows rapidly, effective summarization systems are essential for efficient information consumption. This study investigates abstractive summarization using BART-Large, a transformer-based encoder-decoder model, with a novel training approach: fine-tuning on a combined dataset merging XSum and CNN/DailyMail. While XSum emphasizes highly abstractive single-sentence summaries and CNN/DailyMail targets longer multi-sentence outputs, combining these datasets during training aims to produce a more versatile model capable of handling diverse summarization demands. Performance is evaluated using ROUGE metrics, which measure n-gram overlap between generated and reference summaries. The fine-tuned model achieves a ROUGE-2 score of 22.86, demonstrating competitive performance against existing approaches. Qualitative analysis reveals that the model produces fluent, coherent summaries while maintaining factual consistency with source documents. These results indicate that exposing models to varied summarization demands during fine-tuning can improve flexibility without sacrificing quality. This work offers a practical direction for building more generalizable summarization systems and highlights the potential of multi-dataset training strategies for domain adaptation in natural language generation tasks.
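The sketch below illustrates the kind of multi-dataset fine-tuning setup the abstract describes, assuming the Hugging Face transformers and datasets libraries. The dataset identifiers, column handling, and hyperparameters are illustrative assumptions for this sketch, not the authors' reported configuration.

```python
# Minimal sketch: fine-tuning BART-Large on a merged XSum + CNN/DailyMail corpus.
# Hyperparameters and preprocessing choices are assumptions, not the paper's setup.
from datasets import load_dataset, concatenate_datasets
from transformers import (
    BartForConditionalGeneration,
    BartTokenizerFast,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

MODEL_NAME = "facebook/bart-large"
tokenizer = BartTokenizerFast.from_pretrained(MODEL_NAME)
model = BartForConditionalGeneration.from_pretrained(MODEL_NAME)

# XSum already uses (document, summary) columns; map CNN/DailyMail onto the same schema.
xsum = load_dataset("EdinburghNLP/xsum", split="train").remove_columns("id")
cnndm = (
    load_dataset("cnn_dailymail", "3.0.0", split="train")
    .rename_column("article", "document")
    .rename_column("highlights", "summary")
    .remove_columns("id")
)

# Merge and shuffle so each training batch mixes XSum's one-sentence style
# with CNN/DailyMail's multi-sentence highlights.
combined = concatenate_datasets([xsum, cnndm]).shuffle(seed=42)

def preprocess(batch):
    model_inputs = tokenizer(batch["document"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = combined.map(preprocess, batched=True, remove_columns=combined.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="bart-large-xsum-cnndm",  # hypothetical output path
    per_device_train_batch_size=4,       # assumed; not reported in the abstract
    learning_rate=3e-5,                  # assumed
    num_train_epochs=3,                  # assumed
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

For evaluation, ROUGE scores of the kind reported in the abstract (e.g. ROUGE-2 measuring bigram overlap between generated and reference summaries) can be computed with the evaluate library; the example strings here are placeholders.

```python
import evaluate

rouge = evaluate.load("rouge")
preds = ["the model generated this summary"]          # placeholder prediction
refs = ["the reference summary written by an editor"]  # placeholder reference
print(rouge.compute(predictions=preds, references=refs))  # rouge1, rouge2, rougeL, rougeLsum
```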


Published

2026-04-29

How to Cite

KC, M. B., Kaundinya, A. S., Adhikari, P., & Adhikari, S. (2026). Abstractive Summarization of News Articles Using BART. Pokhara Engineering College Journal, 3(1), 132–148. https://doi.org/10.3126/pecj.v3i1.93538

Section

Research Articles