A Hybrid VADER-BERT Sentiment Analysis Framework for Real-Time Stock Market Prediction
Abstract
The stock market, significantly shaped by public sentiment expressed through news media, social networking sites, and financial discussion forums, adapts dynamically to the collective perceptions of investors. This research presents a real-time sentiment analysis framework that combines VADER (Valence Aware Dictionary and sEntiment Reasoner) and BERT (Bidirectional Encoder Representations from Transformers), representing a substantial advance over prior methodologies. Previous approaches that depend on a single model, whether VADER or BERT, are hindered by intrinsic drawbacks: VADER's rule-based design struggles to interpret complex textual nuance, whereas BERT's computational cost constrains timely analysis. In contrast, the proposed system leverages VADER's rapid processing of short, informal texts together with BERT's deeper understanding of intricate financial language. This combination allows accurate identification of both explicit and implicit sentiment cues, providing a more holistic picture of market sentiment. Sentiment indices generated from this analysis are systematically correlated with fluctuations in stock prices to assess their predictive value. In addition, a web-based application has been developed to support real-time sentiment tracking for selected equities. Empirical evaluation shows that the hybrid framework considerably outperforms conventional single-model systems in both accuracy and speed, establishing a powerful tool for investors and financial researchers.
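To illustrate the routing idea described above, the following is a minimal sketch of how short, informal texts could be scored with VADER while longer, domain-heavy texts are deferred to a BERT-based classifier. It assumes NLTK's VADER implementation and the publicly available "ProsusAI/finbert" checkpoint from Hugging Face; the length threshold and the score-fusion rule are illustrative assumptions, not the authors' exact method.

```python
# Illustrative sketch of length-based VADER/BERT routing (not the paper's exact pipeline).
# Assumptions: NLTK's VADER lexicon, the Hugging Face "ProsusAI/finbert" checkpoint,
# and a hypothetical 25-token routing threshold.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer
from transformers import pipeline

nltk.download("vader_lexicon", quiet=True)

vader = SentimentIntensityAnalyzer()
finbert = pipeline("sentiment-analysis", model="ProsusAI/finbert")


def hybrid_sentiment(text: str, max_vader_tokens: int = 25) -> float:
    """Return a sentiment score in [-1, 1] for a single post or headline."""
    if len(text.split()) <= max_vader_tokens:
        # Short, informal text: VADER's compound score is fast and adequate.
        return vader.polarity_scores(text)["compound"]
    # Longer or domain-specific text: defer to the transformer model.
    result = finbert(text, truncation=True)[0]
    sign = {"positive": 1.0, "negative": -1.0, "neutral": 0.0}[result["label"].lower()]
    return sign * result["score"]


if __name__ == "__main__":
    print(hybrid_sentiment("TSLA to the moon!!!"))
    print(hybrid_sentiment(
        "Despite beating revenue estimates, the company lowered its full-year "
        "guidance, citing margin pressure from rising input costs."
    ))
```

Per-text scores produced this way could then be aggregated into the sentiment indices that the paper correlates with stock price movements.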
Copyright 2025 Boletim da Sociedade Paranaense de Matemática

This work is licensed under a Creative Commons Attribution 4.0 International License.
When a manuscript is accepted for publication, the authors automatically agree to transfer the copyright to the SPM.
The journal uses the Creative Commons Attribution (CC-BY 4.0) license.