Abstract
Suicide remains a serious issue in our society. Studies agree that suicidal ideation is best addressed in its early stages, and researchers have therefore been training different Neural Network (NN) architectures to detect it. Transformers are the dominant NN architecture in the domain of suicidal ideation detection, providing a robust solution not only to this problem but to a wide variety of NLP problems. LSTM-CNN, one of the prominent architectures in the field, has also been proposed as a strong solution. This study evaluates the performance of BERT, RoBERTa, LSTM-CNN, and Bi-LSTM-CNN models for suicidal ideation detection. Our experiments indicate that BERT models have an edge over both LSTM-CNN and Bi-LSTM-CNN models, scoring up to 0.986 accuracy on our test set. Furthermore, a direct comparison of LSTM-CNN with Bi-LSTM-CNN showed no significant difference between the two models. Our paper contributes to the domain by showing no advantage to using LSTM-CNN models over Transformers.
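To make the compared architectures concrete, below is a minimal, hypothetical sketch of a Bi-LSTM-CNN binary text classifier of the kind the abstract evaluates. All layer sizes, the vocabulary size, and the layer ordering (embedding → bidirectional LSTM → 1D convolution → max-pooling → linear head) are illustrative assumptions, not the configuration reported in the paper.

```python
import torch
import torch.nn as nn


class BiLSTMCNN(nn.Module):
    """Illustrative Bi-LSTM-CNN classifier for suicidal ideation detection.

    Hyperparameters below are assumptions for the sketch, not the paper's values.
    """

    def __init__(self, vocab_size=10_000, embed_dim=128,
                 lstm_hidden=64, conv_channels=64, kernel_size=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True doubles the feature dimension of the LSTM output
        self.lstm = nn.LSTM(embed_dim, lstm_hidden,
                            batch_first=True, bidirectional=True)
        self.conv = nn.Conv1d(2 * lstm_hidden, conv_channels,
                              kernel_size, padding=1)
        self.pool = nn.AdaptiveMaxPool1d(1)
        self.fc = nn.Linear(conv_channels, 1)  # single logit: ideation vs. not

    def forward(self, token_ids):
        x = self.embedding(token_ids)                 # (batch, seq, embed)
        x, _ = self.lstm(x)                           # (batch, seq, 2*hidden)
        x = torch.relu(self.conv(x.transpose(1, 2)))  # (batch, channels, seq)
        x = self.pool(x).squeeze(-1)                  # (batch, channels)
        return self.fc(x).squeeze(-1)                 # (batch,) raw logits


# Forward pass on dummy token ids: 4 sequences of length 32
model = BiLSTMCNN()
logits = model(torch.randint(0, 10_000, (4, 32)))
```

For the plain (unidirectional) LSTM-CNN variant also compared in the study, one would set `bidirectional=False` and use `lstm_hidden` input channels in the convolution; a sigmoid over the logits yields the per-post probability of suicidal ideation.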
| Original language | English |
|---|---|
| Title of host publication | 5th International Conference on Computers and Artificial Intelligence Technology (CAIT) |
| Publisher | IEEE |
| Pages | 586-590 |
| Number of pages | 5 |
| ISBN (Electronic) | 9798331530891 |
| DOIs | |
| Publication status | Published - 17 Apr 2025 |
| Event | 5th International Conference on Computers and Artificial Intelligence Technology 2024, Hangzhou, China. Duration: 20 Dec 2024 → 22 Dec 2024. Conference number: 5. https://www.cait.net/2024.html |
Conference
| Conference | 5th International Conference on Computers and Artificial Intelligence Technology 2024 |
|---|---|
| Abbreviated title | CAIT 2024 |
| Country/Territory | China |
| City | Hangzhou |
| Period | 20/12/24 → 22/12/24 |
| Internet address | https://www.cait.net/2024.html |
Keywords
- suicide
- depression
- machine learning
- NLP
- natural language processing
- deep learning