

Abstractive Book Summarization Using Natural Language Processing
Abstract
Summarizing long text is a major challenge in the field of natural language processing (NLP). This project delves into the realm of NLP [1] by exploring the application of the Pegasus model, a state-of-the-art deep learning architecture, to book summarization. Pegasus, known for its ability to produce abstractive summaries, offers a promising approach to condensing complex book narratives. The project involves fine-tuning the Pegasus model on a diverse set of books spanning various genres and topics. Using advanced NLP techniques, Pegasus analyzes each book's text and extracts key themes, ideas, and arguments. Owing to its understanding of language semantics, Pegasus produces coherent and informative summaries that capture the essence of the source text. The practical implications of Pegasus-based book summarization systems are also explored in real NLP applications: such systems have considerable potential to facilitate the collection and dissemination of information in areas ranging from educational tools to content curation. Finally, this project highlights the transformative role of NLP technologies [2], exemplified by the Pegasus model, in addressing the challenges of text summarization. By leveraging Pegasus' NLP capabilities, this project advances the field of automatic summarization and demonstrates the promise of AI-based approaches to unlocking the rich information encapsulated in books.
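As a minimal sketch of the kind of pipeline described above, the following Python example loads a pre-trained Pegasus checkpoint with the Hugging Face transformers library and generates an abstractive summary of a passage. The checkpoint name, generation settings, and the summarize helper are illustrative assumptions, not this project's exact fine-tuned configuration.

# Minimal sketch: abstractive summarization with a pre-trained Pegasus model.
# Assumes the Hugging Face "transformers" library; the checkpoint and generation
# settings below are illustrative, not the configuration used in this project.
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

model_name = "google/pegasus-xsum"  # any Pegasus checkpoint could be substituted
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

def summarize(text: str, max_summary_tokens: int = 128) -> str:
    # Tokenize the input; full book chapters would need to be chunked first,
    # since Pegasus accepts only a fixed maximum number of input tokens.
    batch = tokenizer(text, truncation=True, padding="longest", return_tensors="pt")
    summary_ids = model.generate(**batch, num_beams=4, max_length=max_summary_tokens)
    return tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0]

chapter = "Opening chapter text of the book goes here..."
print(summarize(chapter))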
References
[1] Rahul, S. Adhikari, and Monika, "NLP based machine learning approaches for text summarization," in 2020 Fourth International Conference on Computing Methodologies and Communication (ICCMC), pp. 535–538, 2020.
[2] S. Joseph, K. Sedimo, F. Kaniwa, H. Hlomani, and K. Letsholo, "Natural language processing: A review," vol. 6, pp. 207–210, 03 2016.
[3] M. Patel, A. Pancholi, D. Jain, and R. Jain, "A review paper on applications of natural language processing: Transformation from data-driven to intelligence-driven," vol. 10, pp. 791–796, 06.
[4] S. Porwal, L. Bewoor, and V. Deshpande, "Transformer based implementation for automatic book summarization," International Journal of Intelligent Systems and Applications in Engineering, vol. 10, pp. 123–128, Dec. 2022.
[5] P. Howlader, P. Paul, M. Madavi, L. Bewoor, and V. Deshpande, "Fine tuning transformer based BERT model for generating the automatic book summary," International Journal on Recent and Innovation Trends in Computing and Communication, vol. 10, pp. 347–352, 12 2022.
[6] C. Varagantham, J. Srinija, Y. Uday, K. Madhumitha, and D. P. V. Rao, "Text summarization using NLP," Journal of Emerging Technologies and Innovative Research, vol. 9, pp. c751–c756, May 2022.
[7] P. Sethi, S. Sonawane, S. Khanwalker, and R. B. Keskar, "Automatic text summarization of news articles," in 2017 International Conference on Big Data, IoT and Data Science (BID), pp. 23–29, 2017.
[8] Öykü Berfin Mercan, S. N. Cavsak, A. Deliahmetoglu, and S. Tanberk, "Abstractive text summarization for resumes with cutting edge NLP transformers and LSTM," 2023.
[9] J. Zhang, Y. Zhao, M. Saleh, and P. J. Liu, "PEGASUS: Pre-training with extracted gap-sentences for abstractive summarization," 2020.
[10] A. Rahali and M. A. Akhloufi, "End-to-end transformer-based models in textual-based NLP," AI, vol. 4, pp. 54–110, Jan 2023.