Abstract
In recent years, text has been the main form of communication on social media platforms such as Twitter, Reddit, Facebook, Instagram, and YouTube. Emotion recognition from text on these platforms can be exploited for a wide range of applications. A review of the current literature found that Transformer-based deep learning models show very promising results when trained and fine-tuned for emotion recognition tasks. This paper provides an overview of the architectures of three of the most popular Transformer-based models: BERT Base, DistilBERT, and RoBERTa. These models are also fine-tuned on the “Emotions” dataset, a corpus of English tweets annotated with six (6) different emotions, and their performance is evaluated. The results of this experiment showed that while all of the models demonstrated excellent emotion recognition capabilities, obtaining F1-scores above 92%, DistilBERT could be trained in nearly half the time required by the other models. Thus, the use of DistilBERT for emotion recognition tasks is encouraged.
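The abstract describes fine-tuning Transformer models for six-class emotion classification on English tweets. The paper's exact setup is not reproduced here; the sketch below is a minimal, illustrative fine-tuning pipeline assuming the Hugging Face `transformers` and `datasets` libraries and the publicly available `dair-ai/emotion` corpus. The checkpoint, hyperparameters, and weighted F1 metric are assumptions for illustration, not the authors' configuration.

```python
# Minimal sketch: fine-tune DistilBERT for 6-class emotion recognition.
# Assumes Hugging Face `transformers`/`datasets`; NOT the authors' code,
# and all hyperparameters below are illustrative only.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# English tweets labelled with six emotions (assumed to match the paper's corpus)
dataset = load_dataset("dair-ai/emotion")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    # Pad/truncate to a fixed length so no data collator is needed
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

encoded = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=6)

def compute_metrics(eval_pred):
    # Weighted F1, comparable in spirit to the >92% figure in the abstract
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"f1": f1_score(labels, preds, average="weighted")}

args = TrainingArguments(
    output_dir="distilbert-emotion",
    per_device_train_batch_size=32,
    num_train_epochs=3,
    evaluation_strategy="epoch",  # renamed to `eval_strategy` in newer versions
)

trainer = Trainer(model=model, args=args,
                  train_dataset=encoded["train"],
                  eval_dataset=encoded["validation"],
                  compute_metrics=compute_metrics)
trainer.train()
```

The same sketch applies to BERT Base and RoBERTa by swapping the checkpoint name (e.g. `bert-base-uncased`, `roberta-base`); DistilBERT's smaller size is what yields the shorter training time reported in the abstract.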
Original language | English |
---|---|
Title of host publication | ICISS '23: Proceedings of the 2023 6th International Conference on Information Science and Systems |
Publisher | Association for Computing Machinery |
Pages | 113-118 |
Number of pages | 6 |
ISBN (Print) | 9798400708206 |
DOIs | |
Publication status | Published - 21 Nov 2023 |
Event | 6th International Conference on Information Science and Systems 2023 - Edinburgh, United Kingdom · Duration: 11 Aug 2023 → 13 Aug 2023 · https://conferencealerts.com/show-event?id=248920 |
Conference
Conference | 6th International Conference on Information Science and Systems 2023 |
---|---|
Abbreviated title | ICISS 2023 |
Country/Territory | United Kingdom |
City | Edinburgh |
Period | 11/08/23 → 13/08/23 |
Internet address | https://conferencealerts.com/show-event?id=248920 |