Improved stochastic gradient descent algorithm with mean-gradient adaptive stepsize for solving large-scale optimization problems

Stochastic gradient descent (SGD) is one of the most common algorithms for solving large unconstrained optimization problems. It builds on the classical gradient descent method but modifies how the gradient is selected: instead of the full gradient, SGD computes the gradient from random data points or small batches. It is an iterative algorithm with descent properties that reduces computational cost by using derivatives of randomly sampled data points. This paper proposes a new SGD algorithm with a modified stepsize that employs a function-scaling strategy. In particular, the stepsize parameter is coupled with function scaling by storing the mean of the gradients in the denominator. The performance of the method is evaluated on its ability to reduce the function value at each iteration and to attain the lowest function value when applied to the well-known zebra-strip problem. Our results indicate that the proposed method performs favourably compared with the existing method.
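
The abstract describes the adaptive rule only at a high level, so the following is a minimal illustrative sketch in Python of an SGD iteration whose base stepsize is divided by the running mean of the stochastic gradient norms. The function names (sgd_mean_gradient_stepsize, noisy_grad) and the exact scaling rule are assumptions made for illustration; the paper's precise mean-gradient formula is given in the linked PDF.

    import numpy as np

    def sgd_mean_gradient_stepsize(grad, x0, n_iters=500, alpha0=0.1,
                                   eps=1e-8, seed=0):
        """SGD with an adaptive stepsize: the base stepsize alpha0 is
        divided by the running mean of stochastic gradient norms
        (illustrative only; the paper's exact rule may differ)."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        norm_sum = 0.0
        for k in range(1, n_iters + 1):
            g = grad(x, rng)                   # stochastic gradient at x
            norm_sum += np.linalg.norm(g)
            mean_norm = norm_sum / k           # mean of gradient norms so far
            step = alpha0 / (mean_norm + eps)  # "mean of gradients in the denominator"
            x = x - step * g                   # descent step
        return x

    # Usage: minimise the noisy quadratic f(x) = 0.5 * ||x||^2, whose true
    # gradient is x, perturbed by Gaussian noise to mimic random sampling.
    def noisy_grad(x, rng):
        return x + 0.01 * rng.standard_normal(x.shape)

    x_star = sgd_mean_gradient_stepsize(noisy_grad, x0=np.ones(5))
    print("final iterate:", x_star)

Because the mean of the gradient norms grows whenever the iterates pass through steep regions, this scaling shrinks the stepsize there and allows larger steps where gradients are small, which is one plausible reading of the "function scaling" idea in the abstract.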

Bibliographic Details
Main Authors: Zulkifli, Munierah; Abd Rahmin, Nor Aliza; Wah, June Leong
Format: Article
Language: English
Published: Persatuan Sains Matematik Malaysia, 2023
Published in: Menemui Matematik, 45 (2), pp. 224-230. ISSN 2231-7023
Online Access: http://psasir.upm.edu.my/id/eprint/110372/
http://psasir.upm.edu.my/id/eprint/110372/1/document%20%284%29.pdf
https://myjms.mohe.gov.my/index.php/dismath/article/view/24687