The effect of adaptive parameters on the performance of back propagation


Bibliographic Details
Main Author: Abdul Hamid, Norhamreeza
Format: Thesis
Language: English
Published: 2012
Subjects: Q Science (General); Q300-390 Cybernetics
Online Access:http://eprints.uthm.edu.my/2344/
http://eprints.uthm.edu.my/2344/1/24p%20NORHAMREEZA%20ABDUL%20HAMID.pdf
http://eprints.uthm.edu.my/2344/2/NORHAMREEZA%20ABDUL%20HAMID%20COPYRIGHT%20DECLARATION.pdf
http://eprints.uthm.edu.my/2344/3/NORHAMREEZA%20ABDUL%20HAMID%20WATERMARK.pdf
author Abdul Hamid, Norhamreeza
building UTHM Institutional Repository
collection Online Access
description The Back Propagation algorithm, or one of its variations, on Multilayered Feedforward Networks is widely used in many applications. However, this algorithm is well known to suffer from the local minima problem, particularly as caused by neuron saturation in the hidden layer. Most existing approaches modify the learning model to add a random factor, which counters the tendency to sink into local minima. However, random perturbations of the search direction and various kinds of stochastic adjustment to the current set of weights are not effective in enabling a network to escape from local minima, causing the network to fail to converge to a global minimum within a reasonable number of iterations. Thus, this research proposes a new method, known as Back Propagation Gradient Descent with Adaptive Gain, Adaptive Momentum and Adaptive Learning Rate (BPGD-AGAMAL), which modifies the existing Back Propagation Gradient Descent algorithm by adaptively changing the gain, momentum coefficient and learning rate. In this method, each training pattern has its own activation functions for the neurons in the hidden layer. The activation functions are adjusted through the adaptation of the gain parameter, together with adaptive momentum and learning rate values, during the learning process. The efficiency of the proposed algorithm is compared with conventional Back Propagation Gradient Descent and with Back Propagation Gradient Descent with Adaptive Gain by means of simulation on six benchmark problems, namely breast cancer, card, glass, iris, soybean, and thyroid. The results show that the proposed algorithm substantially improves the learning process of the conventional Back Propagation algorithm.
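The idea described in the abstract can be illustrated with a minimal sketch: gradient descent on a small feedforward network whose sigmoid activations carry a gain parameter, with momentum-smoothed weight updates, where gain, momentum, and learning rate are all adjusted during training. The abstract does not give the thesis's actual adaptation rules, so the heuristics below (grow the parameters while the error falls, shrink them when it rises), the XOR data, and the network sizes are illustrative assumptions, not the BPGD-AGAMAL method itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x, gain):
    # Gain-scaled sigmoid: a larger gain steepens the activation,
    # a smaller one flattens it, which can reduce neuron saturation.
    return 1.0 / (1.0 + np.exp(-gain * x))

# Tiny two-layer network on the XOR problem (illustrative data only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

gain, momentum, lr = 1.0, 0.5, 0.5           # the three adaptive parameters
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)
prev_err = np.inf
errors = []

for epoch in range(2000):
    # Forward pass through the gain-scaled hidden layer.
    h = sigmoid(X @ W1 + b1, gain)
    out = sigmoid(h @ W2 + b2, gain)
    err = float(np.mean((y - out) ** 2))
    errors.append(err)

    # Backward pass: the sigmoid derivative picks up the gain factor.
    d_out = (out - y) * gain * out * (1 - out)
    d_h = (d_out @ W2.T) * gain * h * (1 - h)

    # Momentum-smoothed gradient descent updates (in-place on velocities).
    for v, g in ((vW2, h.T @ d_out), (vb2, d_out.sum(0)),
                 (vW1, X.T @ d_h), (vb1, d_h.sum(0))):
        v *= momentum
        v -= lr * g
    W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1

    # Assumed adaptation heuristic: grow gain, momentum, and learning
    # rate while the error falls; shrink them when it rises (bounded).
    if err < prev_err:
        lr = min(lr * 1.02, 1.0)
        gain = min(gain * 1.001, 2.0)
        momentum = min(momentum * 1.01, 0.9)
    else:
        lr *= 0.7
        momentum *= 0.7
    prev_err = err
```

Coupling the three parameters to the error trend is one simple way to realise "adaptive" behaviour; the thesis compares its particular scheme against fixed-parameter gradient descent on the six benchmark datasets listed above.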
format Thesis
id uthm-2344
institution Universiti Tun Hussein Onn Malaysia
institution_category Local University
language English
publishDate 2012
recordtype eprints
repository_type Digital Repository
2012-04 Thesis NonPeerReviewed. Citation: Abdul Hamid, Norhamreeza (2012) The effect of adaptive parameters on the performance of back propagation. Masters thesis, Universiti Tun Hussein Onn Malaysia.
title The effect of adaptive parameters on the performance of back propagation
topic Q Science (General)
Q300-390 Cybernetics
url http://eprints.uthm.edu.my/2344/