TY - GEN
T1 - Attention Models for Sentiment Analysis Using Objectivity and Subjectivity Word Vectors
AU - Lee, Wing Shum
AU - Ng, Hu
AU - Yap, Timothy Tzen Vun
AU - Ho, Chiung Ching
AU - Goh, Vik Tor
AU - Tong, Hau Lee
N1 - Funding Information:
Acknowledgements This work was supported by the Ministry of Higher Education, Malaysia, under the Fundamental Research Grant Scheme with grant number FRGS/1/2018/ICT02/MMU/03/6 and Multimedia University, under the CAPEX fund with grant number MMUI/CAPEX170008.
Publisher Copyright:
© 2021, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
PY - 2021/3/16
Y1 - 2021/3/16
N2 - In this research, we examine the notions of objectivity and subjectivity and create word embeddings from them for the purpose of sentiment analysis. We created word vectors from two datasets: the Wikipedia English Dataset for objectivity and the Amazon Product Reviews Data dataset for subjectivity. A model incorporating an Attention Mechanism was proposed and compared with Logistic Regression and Linear Support Vector Classification models; the Attention model achieved the highest accuracy when given sufficiently large data through augmentation. With regard to objectivity and subjectivity, models trained with the objectivity word embeddings performed worse than their subjectivity counterparts. However, when compared with the BERT model, which also incorporates an Attention Mechanism but uses its own word embedding technique, the BERT model achieved higher accuracy even though its training was performed with transfer learning only.
AB - In this research, we examine the notions of objectivity and subjectivity and create word embeddings from them for the purpose of sentiment analysis. We created word vectors from two datasets: the Wikipedia English Dataset for objectivity and the Amazon Product Reviews Data dataset for subjectivity. A model incorporating an Attention Mechanism was proposed and compared with Logistic Regression and Linear Support Vector Classification models; the Attention model achieved the highest accuracy when given sufficiently large data through augmentation. With regard to objectivity and subjectivity, models trained with the objectivity word embeddings performed worse than their subjectivity counterparts. However, when compared with the BERT model, which also incorporates an Attention Mechanism but uses its own word embedding technique, the BERT model achieved higher accuracy even though its training was performed with transfer learning only.
KW - Objectivity
KW - Sentiment analysis
KW - Subjectivity
KW - Word vectors
UR - http://www.scopus.com/inward/record.url?scp=85103516514&partnerID=8YFLogxK
U2 - 10.1007/978-981-33-4069-5_5
DO - 10.1007/978-981-33-4069-5_5
M3 - Conference contribution
AN - SCOPUS:85103516514
SN - 9789813340688
T3 - Lecture Notes in Electrical Engineering
SP - 51
EP - 59
BT - Computational Science and Technology
A2 - Alfred, Rayner
A2 - Iida, Hiroyuki
A2 - Haviluddin, Haviluddin
A2 - Anthony, Patricia
PB - Springer
T2 - 7th International Conference on Computational Science and Technology 2020
Y2 - 29 August 2020 through 30 August 2020
ER -