Enhancing sentiment analysis through deep layer integration with long short-term memory networks

Parul Dubey, Pushkar Dubey, Hitesh Gehani

Abstract


Sentiment analysis is one of the most important tasks in natural language processing (NLP): determining whether a sentence expresses a neutral, positive, or negative opinion. This paper presents an enhanced long short-term memory (LSTM) network for sentiment analysis that uses an additional deep layer to capture sublevel patterns in the word input. Our approach follows a standard pipeline: cleaning the data, preprocessing it, building the model, training it, and finally testing it. The novelty lies in the additional layer in the LSTM architecture, added with the intention of improving accuracy and helping the model generalize. The experimental results are evaluated using accuracy, recall, and F1-score, and show that the deep-layered LSTM yields better predictions, outperforming the baseline on all three metrics. Once trained to capture intricate sequences, the deep layer improved prediction accuracy substantially. However, the improved model overfitted, necessitating additional regularization and hyperparameter tuning. We discuss the advantages and disadvantages of using deep layers in LSTM networks and their application to building better-performing deep learning models for sentiment analysis.
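As a rough illustration of the architecture the abstract describes, the sketch below stacks a second LSTM layer on a baseline recurrent model and adds dropout as the kind of regularization the authors note was needed against overfitting. This is a minimal sketch in Keras, not the authors' implementation; the hyperparameters (vocabulary size, sequence length, embedding width, unit counts, dropout rate, three output classes) are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch of a deep-layered LSTM sentiment classifier (Keras).
# All hyperparameters below are assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, models

vocab_size = 10_000    # assumed vocabulary size after cleaning/preprocessing
max_len = 100          # assumed padded sequence length
embedding_dim = 128    # assumed embedding width

model = models.Sequential([
    layers.Input(shape=(max_len,)),
    layers.Embedding(vocab_size, embedding_dim),
    # Baseline recurrent layer; return_sequences=True passes the full
    # sequence on to the additional deep layer.
    layers.LSTM(64, return_sequences=True),
    # The "additional deep layer": a second stacked LSTM intended to
    # capture sublevel patterns in the word input.
    layers.LSTM(32),
    # Dropout stands in for the extra regularization the abstract says
    # the deeper model required.
    layers.Dropout(0.5),
    layers.Dense(3, activation="softmax"),  # neutral / positive / negative
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

Training would then proceed with model.fit on integer-encoded, padded sequences; early stopping and dropout-rate tuning are further options consistent with the paper's note on overfitting and hyperparameter adjustment.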

Keywords


Deep learning; Long short-term memory; Natural language processing; Sentiment analysis; Text classification



DOI: https://doi.org/10.11591/ijece.v15i1.pp949-957

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578

This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).