An Integer-Fractional Gradient Algorithm for Back Propagation Neural Networks

Bibliographic Details
Main Authors: Zhang, Yiqun; Xu, Honglei; Li, Yang; Lin, Gang; Zhang, Liyuan; Tao, Chaoyang; Wu, Yonghong
Format: Journal Article
Published: 2024
DOI: 10.3390/a17050220
Online Access: http://purl.org/au-research/grants/arc/LP160100528
http://hdl.handle.net/20.500.11937/96289
License: https://creativecommons.org/licenses/by/4.0/

Description
This paper proposes a new optimization algorithm for backpropagation (BP) neural networks by fusing integer-order and fractional-order differentiation. While fractional-order differentiation has significant advantages in describing complex phenomena with long-term memory effects and nonlocality, its application in neural networks is often limited by a lack of physical interpretability and by inconsistencies with traditional models. To address these challenges, we propose a mixed integer-fractional (MIF) gradient descent algorithm for the training of neural networks. Furthermore, a detailed convergence analysis of the proposed algorithm is provided. Finally, numerical experiments illustrate that the new gradient descent algorithm not only speeds up the convergence of BP neural networks but also increases their classification accuracy.
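
The record does not reproduce the paper's update rule, so the following is only a minimal sketch of the idea behind a mixed integer-fractional gradient step, assuming the MIF update blends the ordinary (integer-order) gradient with a truncated Caputo fractional gradient of order alpha in (0, 1), an approximation commonly used in fractional gradient descent. The function name mif_step, the mixing weight lam, and the choice of the previous iterate as the Caputo lower terminal are illustrative assumptions, not the authors' formulation.

import numpy as np
from scipy.special import gamma

def mif_step(w, w_prev, grad, lr=0.1, alpha=0.9, lam=0.5):
    # Truncated Caputo approximation of the order-alpha gradient,
    # using the previous iterate as the lower terminal:
    #   D^alpha f(w) ~= f'(w) * |w - w_prev|^(1 - alpha) / Gamma(2 - alpha)
    frac_grad = grad * np.abs(w - w_prev) ** (1.0 - alpha) / gamma(2.0 - alpha)
    # Assumed MIF blend: convex combination of integer- and fractional-order gradients.
    return w - lr * (lam * grad + (1.0 - lam) * frac_grad)

# Toy usage: minimize f(w) = ||w||^2 / 2, whose gradient is simply w.
w_prev, w = np.array([2.0, -1.5]), np.array([1.8, -1.2])
for _ in range(50):
    w, w_prev = mif_step(w, w_prev, grad=w), w
print(w)  # approaches the minimizer at the origin

In a BP network, grad would be the backpropagated loss gradient for each weight matrix; the paper's convergence analysis presumably constrains lr, lam, and alpha, which are fixed arbitrarily in this sketch.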