Knowledge-Infused Corpus Building for Context-Aware Summarization with BERT Model

Authors

  • M. Jeyakarthic
  • A. Leoraj

Abstract

In the era of information overload, effective summarization techniques that capture the nuanced context of diverse textual content are in high demand. This research introduces a novel approach to context-aware summarization built on a knowledge-infused BERT (Bidirectional Encoder Representations from Transformers) model. The proposed methodology leverages BERT to generate contextually rich embeddings from text while simultaneously incorporating structured knowledge from a domain-specific corpus, which is constructed through careful data pre-processing. The methodology is validated on the CNN/DailyMail dataset, which encompasses diverse news articles, and ROUGE metrics are employed to assess the quality of the generated summaries. The results showcase the potential of this integrated approach to enhance the context-awareness and informativeness of generated summaries, thereby contributing to the advancement of natural language processing and information retrieval systems. This research also contributes to the broader exploration of knowledge-aware models for enriching text analysis of news articles.
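The abstract does not give implementation details, so the following is only a minimal, hypothetical sketch of the pipeline it outlines: BERT embeddings over CNN/DailyMail articles and ROUGE-based evaluation of the resulting summaries. It uses the Hugging Face transformers and datasets packages and the rouge_score library, and it substitutes a simple extractive baseline (sentences scored by cosine similarity to a mean-pooled article embedding) for the authors' knowledge-infused model; the corpus-building and knowledge-infusion steps are not reproduced here.

    # Hypothetical sketch, not the authors' implementation: score each sentence by
    # cosine similarity between its mean-pooled BERT embedding and the embedding of
    # the whole article, keep the top-k sentences, and evaluate against the
    # reference highlights with ROUGE.
    import torch
    from datasets import load_dataset
    from transformers import AutoModel, AutoTokenizer
    from rouge_score import rouge_scorer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()

    def embed(texts):
        # Mean-pooled last-layer BERT embeddings for a list of strings.
        batch = tokenizer(texts, padding=True, truncation=True,
                          max_length=512, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**batch).last_hidden_state        # (B, T, 768)
        mask = batch["attention_mask"].unsqueeze(-1)         # (B, T, 1)
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # (B, 768)

    def summarize(article, k=3):
        # Extractive baseline: keep the k sentences closest to the article embedding.
        sentences = [s.strip() for s in article.split(". ") if s.strip()]
        doc_vec = embed([article])
        sent_vecs = embed(sentences)
        scores = torch.nn.functional.cosine_similarity(sent_vecs, doc_vec)
        top = scores.topk(min(k, len(sentences))).indices.sort().values
        return ". ".join(sentences[int(i)] for i in top)

    # Small validation slice of CNN/DailyMail; print ROUGE-1 and ROUGE-L F1 per article.
    dataset = load_dataset("cnn_dailymail", "3.0.0", split="validation[:5]")
    scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
    for example in dataset:
        summary = summarize(example["article"])
        rouge = scorer.score(example["highlights"], summary)
        print({name: round(s.fmeasure, 3) for name, s in rouge.items()})

Any abstractive decoder, knowledge-graph lookup, or domain-specific corpus filtering described in the full paper would replace the plain cosine-similarity scoring used above.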

Published

2024-02-02

How to Cite

Jeyakarthic, M., & Leoraj, A. (2024). Knowledge-Infused Corpus Building for Context-Aware Summarization with BERT Model. Migration Letters, 21(S4), 1681–1694. Retrieved from https://migrationletters.com/index.php/ml/article/view/7588

Issue

Vol. 21 No. S4 (2024)

Section

Articles