Recent advances in meta-heuristic algorithms for training multilayer perceptron neural networks
Artificial Neural Networks (ANNs) have demonstrated applicability and effectiveness in several domains, including classification tasks. Researchers have emphasized the training techniques of ANNs to identify appropriate weights and biases. However, conventional training techniques such as Gradient Descent (GD) and Backpropagation (BP) often suffer from early convergence, dependence on initial parameters, and susceptibility to local optima, limiting their efficiency in complex, high-dimensional problems. Meta-heuristic algorithms (MHAs) offer a promising alternative as practical approaches for training ANNs, providing global search capabilities, robustness, and improved computational efficiency. Despite the growing use of MHAs, existing studies often focus on specific subsets of algorithms or narrow application domains, leaving a gap in understanding their comprehensive potential and comparative performance across diverse classification tasks. This paper addresses this gap by presenting a systematic review of advancements in training Multilayer Perceptron (MLP) neural networks using MHAs, analyzing 53 publications from 2014 to 2024. The research papers were chosen explicitly from four widely used databases: ScienceDirect, Scopus, Springer, and IEEE Xplore. Key contributions include a comparative analysis of evolutionary, swarm intelligence, physics-based, human-inspired algorithms, and hybrid approaches benchmarked on classification datasets. The study also highlights bibliometric trends, identifies underexplored areas such as adaptive and hybrid algorithms, and emphasizes the practical application of MHAs in optimizing ANN performance. This work is a significant resource for researchers, facilitating the identification of effective optimization methodologies and bridging the gap between theoretical advancements and real-world applications.
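The abstract describes replacing gradient-based training with metaheuristic search over an MLP's weights and biases. The sketch below illustrates that general idea only; it is not taken from the reviewed paper. The toy dataset, the 2-5-1 architecture, and the choice of a basic differential-evolution loop are assumptions made purely for illustration.

```python
# Minimal illustrative sketch (assumption, not the paper's method): optimizing a
# small MLP's weights and biases with a simple metaheuristic (DE/rand/1/bin)
# instead of gradient descent / backpropagation.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two Gaussian blobs.
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# MLP architecture: 2 inputs -> 5 hidden units -> 1 output.
n_in, n_hid, n_out = 2, 5, 1
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out  # total weights + biases

def unpack(v):
    """Split a flat parameter vector into weight matrices and bias vectors."""
    i = 0
    W1 = v[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = v[i:i + n_hid]; i += n_hid
    W2 = v[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = v[i:i + n_out]
    return W1, b1, W2, b2

def forward(v, X):
    W1, b1, W2, b2 = unpack(v)
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    return 1 / (1 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(v):
    """Mean squared classification error; the metaheuristic minimizes this."""
    p = forward(v, X).ravel()
    return np.mean((p - y) ** 2)

# Basic differential evolution over the flat parameter vector.
pop_size, F, CR, generations = 30, 0.5, 0.9, 200
pop = rng.uniform(-1, 1, (pop_size, dim))
scores = np.array([fitness(ind) for ind in pop])

for _ in range(generations):
    for i in range(pop_size):
        idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        mutant = a + F * (b - c)                    # differential mutation
        cross = rng.random(dim) < CR                # binomial crossover mask
        trial = np.where(cross, mutant, pop[i])
        s = fitness(trial)
        if s < scores[i]:                           # greedy selection
            pop[i], scores[i] = trial, s

best = pop[np.argmin(scores)]
acc = np.mean((forward(best, X).ravel() > 0.5) == y)
print(f"best MSE = {scores.min():.4f}, training accuracy = {acc:.2f}")
```

The same loop structure applies to the other algorithm families the review compares (swarm, physics-based, human-inspired, hybrid): only the update rule that produces the trial vector changes, while the fitness function over the flattened weight vector stays the same.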
| Main Authors: | Al-Asaady, Maher Talal; Mohd Aris, Teh Noranis; Mohd Sharef, Nurfadhlina; Hamdan, Hazlina |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Politeknik Negeri Padang, 2025 |
| Online Access: | http://psasir.upm.edu.my/id/eprint/118626/ http://psasir.upm.edu.my/id/eprint/118626/1/118626.pdf |
| author | Al-Asaady, Maher Talal; Mohd Aris, Teh Noranis; Mohd Sharef, Nurfadhlina; Hamdan, Hazlina |
| building | UPM Institutional Repository |
| collection | Online Access |
| format | Article |
| id | upm-118626 |
| institution | Universiti Putra Malaysia |
| institution_category | Local University |
| language | English |
| publishDate | 2025 |
| publisher | Politeknik Negeri Padang |
| recordtype | eprints |
| repository_type | Digital Repository |
| spelling | Al-Asaady, Maher Talal and Mohd Aris, Teh Noranis and Mohd Sharef, Nurfadhlina and Hamdan, Hazlina (2025) Recent advances in meta-heuristic algorithms for training multilayer perceptron neural networks. International Journal on Informatics Visualization, 9 (2). pp. 658-673. ISSN 2549-9610; eISSN 2549-9904. Politeknik Negeri Padang. Peer-reviewed article, English, CC BY-SA 4.0. DOI: 10.62527/joiv.9.2.3109. http://joiv.org/index.php/joiv/article/view/3109 http://psasir.upm.edu.my/id/eprint/118626/1/118626.pdf |
| title | Recent advances in meta-heuristic algorithms for training multilayer perceptron neural networks |
| url | http://psasir.upm.edu.my/id/eprint/118626/ http://psasir.upm.edu.my/id/eprint/118626/1/118626.pdf |