Bi-STAB: Enhanced abstractive text summarization model using hybridized deep learning network

Authors

  • Vanathi Dhinakaran, Nandha Engineering College
  • Abdullah Baihan, King Saud University (https://orcid.org/0000-0003-4407-7790)
  • Ahmed Alhussen, Majmaah University
  • Radhakrishnan Paramasivan, SR University

DOI:

https://doi.org/10.4025/actascitechnol.v47i1.71378

Abstract

Abstractive text summarization builds on the Sequence-to-Sequence architecture, which is designed to capture syntactic and contextual relationships between words. Abstractive summarization is needed to address information overload and enhance summary generation. However, it faces challenges such as maintaining consistency, capturing subtle nuances, and striking a balance between conciseness and comprehensiveness. To overcome these challenges, a novel Bi-directional Stacked GRU-LSTM for ABstractive text summarization (Bi-STAB) framework is proposed to generate an effective text summarization system using a hybridized deep learning network. Initially, the original document is pre-processed and fed to the hybridized stacked deep learning network for relevant feature extraction. The hybridized deep learning network extracts features from the text by capturing both forward and backward sequential dependencies, ensuring contextual information from both directions. An Attention-based Neural Network then selects the most relevant parts of the text from these extracted features to form a coherent summary. Finally, the summary is post-processed to enhance readability and provide a concise and accurate representation of the original text. A simulation of Bi-STAB is performed using MATLAB, and validation is performed using the WCEP and Gigaword datasets. Comparing accuracy against existing techniques, DeepSumm, MOOTweetSumm, and WL-AttenSumm achieve 84.80, 88.35, and 92.17%, respectively, while the proposed Bi-STAB framework outperforms them with 97.64%.
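The pipeline the abstract describes, a bidirectional recurrent encoder whose forward and backward states are concatenated and then pooled by an attention layer, can be sketched in plain NumPy. This is an illustrative toy reconstruction of the general technique, not the authors' Bi-STAB implementation: all function names, weight initializations, and dimensions are arbitrary, biases are omitted, and the stacked GRU-LSTM hybrid is reduced to a single GRU layer per direction for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_gru_params(d_in, d_h):
    """Random weights for one GRU direction: (Wz, Uz, Wr, Ur, Wh, Uh)."""
    return tuple(rng.normal(scale=0.1, size=s)
                 for s in [(d_h, d_in), (d_h, d_h)] * 3)

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step (biases omitted for brevity)."""
    z = 1.0 / (1.0 + np.exp(-(Wz @ x + Uz @ h)))   # update gate
    r = 1.0 / (1.0 + np.exp(-(Wr @ x + Ur @ h)))   # reset gate
    h_cand = np.tanh(Wh @ x + Uh @ (r * h))        # candidate state
    return (1 - z) * h + z * h_cand

def bidirectional_encode(X, p_fwd, p_bwd):
    """Encode X (T, d_in) in both directions; return (T, 2*d_h) states."""
    T, d_h = X.shape[0], p_fwd[0].shape[0]
    h, fwd = np.zeros(d_h), []
    for t in range(T):                              # left-to-right pass
        h = gru_cell(X[t], h, *p_fwd)
        fwd.append(h)
    h, bwd = np.zeros(d_h), [None] * T
    for t in reversed(range(T)):                    # right-to-left pass
        h = gru_cell(X[t], h, *p_bwd)
        bwd[t] = h
    return np.concatenate([np.stack(fwd), np.stack(bwd)], axis=1)

def attention_pool(H, v):
    """Softmax-weight each timestep by a relevance score H @ v."""
    scores = H @ v
    w = np.exp(scores - scores.max())
    w /= w.sum()
    context = (w[:, None] * H).sum(axis=0)          # weighted summary vector
    return w, context

# Toy run: 5 "words" with 4-dim embeddings, hidden size 3 per direction.
X = rng.normal(size=(5, 4))
H = bidirectional_encode(X, make_gru_params(4, 3), make_gru_params(4, 3))
weights, context = attention_pool(H, rng.normal(size=H.shape[1]))
print(H.shape, context.shape)  # (5, 6) (6,)
```

In a full model the attention weights would condition a decoder that generates the summary token by token; here they simply pool the encoder states into one context vector to show how the relevance selection works.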

Downloads

No statistical data available.

References

Aliakbarpour, H., Manzuri, M. T., & Rahmani, A. M. (2022). Improving the readability and saliency of abstractive text summarization using combination of deep neural networks equipped with auxiliary attention mechanism. The Journal of Supercomputing, 78(2), 2528–2555. https://doi.org/10.1007/s11227-021-03950-x

Alomari, A., Al-Shamayleh, A. S., Idris, N., Sabri, A. Q. M., Alsmadi, I., & Omary, D. (2023). Warm-starting for improving the novelty of abstractive summarization. IEEE Access. https://doi.org/10.1109/ACCESS.2023.3322226

Alzubi, J. A., Jain, R., Kathuria, A., Khandelwal, A., Saxena, A., & Singh, A. (2020). Paraphrase identification using collaborative adversarial networks. Journal of Intelligent & Fuzzy Systems, 39(1), 1021–1032. https://doi.org/10.3233/JIFS-191933

Anand, D., & Wagh, R. (2022). Effective deep learning approaches for summarization of legal texts. Journal of King Saud University - Computer and Information Sciences, 34(5), 2141–2150. https://doi.org/10.1016/j.jksuci.2019.11.015

Cheng, J., Zhang, F., & Guo, X. (2020). A syntax-augmented and headline-aware neural text summarization method. IEEE Access, 8, 218360–218371. https://doi.org/10.1109/ACCESS.2020.3042886

Deng, J., Cheng, L., & Wang, Z. (2021). Attention-based BiLSTM fused CNN with gating mechanism model for Chinese long text classification. Computer Speech & Language, 68, 101182. https://doi.org/10.1016/j.csl.2020.101182

Dilawari, A., Khan, M. U. G., Saleem, S., & Shaikh, F. S. (2023). Neural attention model for abstractive text summarization using linguistic feature space. IEEE Access, 11, 23557–23564. https://doi.org/10.1109/ACCESS.2023.3249783

Gambhir, M., & Gupta, V. (2022). Deep learning-based extractive text summarization with word-level attention mechanism. Multimedia Tools and Applications, 81(15), 20829–20852. https://doi.org/10.1007/s11042-022-12729-y

Gnanamalar, A. J., Bhavani, R., & Arulini, A. S. (2023). CNN–SVM based fault detection, classification and location of multi-terminal VSC–HVDC system. Journal of Electrical Engineering & Technology, 18, 3335–3347. https://doi.org/10.1007/s42835-023-01391-5

Gudakahriz, S. J., Moghadam, A. M. E., & Mahmoudi, F. (2023). Opinion texts summarization based on texts concepts with multi-objective pruning approach. The Journal of Supercomputing, 79(5), 5013–5036. https://doi.org/10.1007/s11227-022-04842-4

Guetari, R., & Kraiem, N. (2023). CoMod: An abstractive approach to discourse context identification. IEEE Access. https://doi.org/10.1109/ACCESS.2023.3302179

Guo, Q., Huang, J., Xiong, N., & Wang, P. (2019). MS-pointer network: Abstractive text summary based on multi-head self-attention. IEEE Access, 7, 138603–138613. https://doi.org/10.1109/ACCESS.2019.2941964

Jang, H., & Kim, W. (2021). Reinforced abstractive text summarization with semantic added reward. IEEE Access, 9, 103804–103810. https://doi.org/10.1109/ACCESS.2021.3097087

Jiang, J., Zhang, H., Dai, C., Zhao, Q., Feng, H., Ji, Z., & Ganchev, I. (2021). Enhancements of attention-based bidirectional LSTM for hybrid automatic text summarization. IEEE Access, 9, 123660–123671. https://doi.org/10.1109/ACCESS.2021.3110143

Joshi, A., Fidalgo, E., Alegre, E., & Fernández-Robles, L. (2023). DeepSumm: Exploiting topic models and sequence to sequence networks for extractive text summarization. Expert Systems with Applications, 211, 118442. https://doi.org/10.1016/j.eswa.2022.118442

Moratanch, N., & Chitrakala, S. (2018). A novel framework for semantic oriented abstractive text summarization. Journal of Web Engineering, 17(8), 675–715. https://doi.org/10.13052/jwe1540-9589.1784

Movassagh, A. A., Alzubi, J. A., Gheisari, M., Rahimi, M., Mohan, S., Abbasi, A. A., & Nabipour, N. (2023). Artificial neural networks training algorithm integrating invasive weed optimization with differential evolutionary model. Journal of Ambient Intelligence and Humanized Computing. https://doi.org/10.1007/s12652-020-02623-6

Raza, H., & Shahzad, W. (2024). End to end Urdu abstractive text summarization with dataset and improvement in evaluation metric. IEEE Access. https://doi.org/10.1109/ACCESS.2024.3377463

Shin, J., Park, S. B., & Song, H. J. (2023). Token-level fact correction in abstractive summarization. IEEE Access, 11, 1934–1943. https://doi.org/10.1109/ACCESS.2022.3233854

Shin, Y. (2023). Multi-encoder transformer for Korean abstractive text summarization. IEEE Access, 11, 48768–48782. https://doi.org/10.1109/ACCESS.2023.3277754

Tank, M., & Thakkar, P. (2024). Abstractive text summarization using adversarial learning and deep neural network. Multimedia Tools and Applications, 83(17), 50849–50870. https://doi.org/10.1007/s11042-023-17478-0

Ulker, M., & Ozer, A. B. (2024). Abstractive summarization model for summarizing scientific article. IEEE Access. https://doi.org/10.1109/ACCESS.2024.3420163

You, F., Zhao, S., & Chen, J. (2020). A topic information fusion and semantic relevance for text summarization. IEEE Access, 8, 178946–178953. https://doi.org/10.1109/ACCESS.2020.2999665

Zhuang, H., & Zhang, W. (2019). Generating semantically similar and human-readable summaries with generative adversarial networks. IEEE Access, 7, 169426–169433. https://doi.org/10.1109/ACCESS.2019.2955087


Published

2025-06-17

How to Cite

Dhinakaran, V., Baihan, A., Alhussen, A., & Paramasivan, R. (2025). Bi-STAB: Enhanced abstractive text summarization model using hybridized deep learning network. Acta Scientiarum. Technology, 47(1). https://doi.org/10.4025/actascitechnol.v47i1.71378

Issue

Section

Computer Science

CiteScore 2019: 0.8 (36th percentile). Powered by Scopus.