TY - GEN
T1 - Optimizing Binary Classification Performance in Neural Networks Through Simulation: A Comparative Study of Activation Functions
AU - Boateng, Alexander
AU - Aidoo, Eric Nimako
AU - Maposa, Daniel
AU - Odoom, Christopher
AU - Owusu, Samuel Adjei
PY - 2025/1/10
Y1 - 2025/1/10
N2 - Binary classification using Artificial Neural Networks (ANNs) is a fundamental problem in machine learning, and the choice of activation function plays a vital role in determining model performance. This study investigates the performance of various activation functions for binary classification tasks using neural networks across multiple datasets. The results show that ReLU and Tanh consistently outperform Logistic in accuracy, precision, recall, F1-score, and AUC-ROC, making them versatile choices for diverse tasks. Identity, however, exhibits variable performance, highlighting the need for careful activation function selection based on task specifics. The study also examines the impact of dataset size on activation function performance, with ReLU and Tanh remaining consistent across varying data volumes. Practitioners can leverage these insights to optimize neural network designs, improving model efficacy in binary classification tasks.
KW - Binary classification
KW - Logistic/Sigmoid
KW - Identity
KW - Neural Networks
KW - ReLU
KW - Tanh
UR - https://www.scopus.com/pages/publications/85218463924
U2 - 10.1007/978-3-031-73324-6_54
DO - 10.1007/978-3-031-73324-6_54
M3 - Conference contribution
SN - 9783031733239
T3 - Lecture Notes in Networks and Systems
SP - 555
EP - 568
BT - Intelligent Computing and Optimization
PB - Springer
T2 - 7th International Conference on Intelligent Computing and Optimization 2023
Y2 - 26 October 2023 through 27 October 2023
ER -
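
Illustrative note (not part of the record above): the abstract describes comparing Logistic, Identity, ReLU, and Tanh activations on binary classification metrics. A minimal sketch of that kind of comparison is given below, assuming scikit-learn's MLPClassifier, whose activation options happen to match the keywords here; the synthetic dataset, network size, and hyperparameters are illustrative assumptions, not the authors' configuration.

# Hedged sketch: compare activation functions on a binary classification
# task, reporting accuracy and AUC-ROC as in the study's metric set.
# Dataset and hyperparameters are assumptions for illustration only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, roc_auc_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for act in ["logistic", "identity", "relu", "tanh"]:
    clf = MLPClassifier(hidden_layer_sizes=(32,), activation=act,
                        max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)[:, 1]  # P(class 1) for AUC-ROC
    print(f"{act:>8}: acc={accuracy_score(y_te, clf.predict(X_te)):.3f}, "
          f"auc={roc_auc_score(y_te, proba):.3f}")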