TY - GEN
T1 - Optimizing Artificial Neural Network for Functions Approximation Using Particle Swarm Optimization
AU - Zaghloul, Lina
AU - Zaghloul, Rawan
AU - Hamdan, Mohammad
N1 - Publisher Copyright:
© 2021, Springer Nature Switzerland AG.
PY - 2021/7/7
Y1 - 2021/7/7
N2 - Artificial neural networks (ANN) are commonly used in function approximation as well as classification problems. This paper presents a configurable architecture of a simple feed-forward neural network trained by the particle swarm optimization (PSO) algorithm. Both PSO and ANN have several hyperparameters that affect the approximation results. The ANN hyperparameters are the number of layers, the number of neurons in each layer, and the neuron activation functions. The PSO hyperparameters are the population size, the number of informants per particle, and the acceleration coefficients. This work highlights how the PSO hyperparameters affect the ability of the algorithm to optimize the ANN weights in the function approximation task. This was examined through multiple experiments on different types of input functions: cubic, linear, and the XOR problem. The results of the proposed method show the superiority of PSO over backpropagation in terms of MSE.
AB - Artificial neural networks (ANN) are commonly used in function approximation as well as classification problems. This paper presents a configurable architecture of a simple feed-forward neural network trained by the particle swarm optimization (PSO) algorithm. Both PSO and ANN have several hyperparameters that affect the approximation results. The ANN hyperparameters are the number of layers, the number of neurons in each layer, and the neuron activation functions. The PSO hyperparameters are the population size, the number of informants per particle, and the acceleration coefficients. This work highlights how the PSO hyperparameters affect the ability of the algorithm to optimize the ANN weights in the function approximation task. This was examined through multiple experiments on different types of input functions: cubic, linear, and the XOR problem. The results of the proposed method show the superiority of PSO over backpropagation in terms of MSE.
KW - Artificial Neural Network (ANN)
KW - Backpropagation
KW - Function Approximation
KW - Mean Square Error (MSE)
KW - Particle Swarm Optimization (PSO)
UR - http://www.scopus.com/inward/record.url?scp=85112076549&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-78743-1_20
DO - 10.1007/978-3-030-78743-1_20
M3 - Conference contribution
AN - SCOPUS:85112076549
SN - 9783030787424
T3 - Lecture Notes in Computer Science
SP - 223
EP - 231
BT - Advances in Swarm Intelligence. ICSI 2021
A2 - Tan, Ying
A2 - Shi, Yuhui
PB - Springer
T2 - 12th International Conference on Advances in Swarm Intelligence 2021
Y2 - 17 July 2021 through 21 July 2021
ER -
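
As a rough illustration of the approach the abstract describes (and not the authors' implementation), the sketch below trains the weights of a one-hidden-layer feed-forward network with a basic global-best PSO on a cubic target, one of the test functions mentioned in the abstract. The informant topology is omitted for brevity, and all names (pso_train, forward, mse) and parameter values are illustrative assumptions.

# Minimal sketch: PSO-trained feed-forward network for function approximation.
# Hypothetical code, not from the cited paper; global-best PSO only.
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X, n_hidden):
    # Unpack a flat weight vector into a (n_in -> n_hidden -> 1) network.
    n_in = X.shape[1]
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden].reshape(n_hidden, 1); i += n_hidden
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)      # hidden-layer activation
    return h @ W2 + b2            # linear output for regression

def mse(w, X, y, n_hidden):
    # Mean squared error of the network defined by weight vector w.
    return float(np.mean((forward(w, X, n_hidden).ravel() - y) ** 2))

def pso_train(X, y, n_hidden=8, n_particles=30, iters=300,
              w_inertia=0.7, c1=1.5, c2=1.5):
    # Each particle is one flat weight vector; c1/c2 are the acceleration
    # coefficients and n_particles the population size named in the abstract.
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_err = np.array([mse(p, X, y, n_hidden) for p in pos])
    gbest = pbest[np.argmin(pbest_err)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = (w_inertia * vel
               + c1 * r1 * (pbest - pos)    # cognitive pull toward personal best
               + c2 * r2 * (gbest - pos))   # social pull toward global best
        pos = pos + vel
        err = np.array([mse(p, X, y, n_hidden) for p in pos])
        improved = err < pbest_err
        pbest[improved], pbest_err[improved] = pos[improved], err[improved]
        gbest = pbest[np.argmin(pbest_err)].copy()
    return gbest, pbest_err.min()

# Example: approximate a cubic function, one of the test cases in the abstract.
X = np.linspace(-1, 1, 100).reshape(-1, 1)
y = X.ravel() ** 3
weights, best_mse = pso_train(X, y)
print(f"best MSE: {best_mse:.6f}")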