Hyperparameter tuning of deep neural network in time series forecasting

A Deep Artificial Neural Network (DANN) is an Artificial Neural Network (ANN) with multiple hidden layers, making it a 'deep' form of ANN. Because a DANN is itself a Deep Neural Network (DNN), it falls under the broader DNN category, and DANNs are widely used in time series forecasting. The performance of a DANN depends heavily on the choice of hyperparameters, and selecting them at random can increase forecasting error. Hence, this study aims to optimize the performance of the DANN in time series forecasting by tuning two important hyperparameters: the number of epochs and the batch size. DANNs with 1, 10, 20, 50 and 100 epochs and batch sizes of 32 and 64 are combined in a grid search over these hyperparameters. The performance of each model is evaluated and compared using the mean square error (MSE) and mean absolute error (MAE); in addition, the mean absolute percentage error (MAPE) is used to compare the DANN model on high-frequency and low-frequency time series data. The study uses both simulated and real-life data to assess the DANN model. The results show that more than one epoch is needed for good performance. Specifically, the analysis of simulated data consistently suggests that 10 epochs give optimal results. Similarly, 10 epochs are optimal for low-frequency real-life data, whereas high-frequency real-life data favour 100 epochs. The findings also indicate that batch sizes of 32 and 64 are each optimal in different combinations. Hence, this study suggests that hyperparameter tuning is a crucial first step in the learning process: it ensures the selection of appropriate hyperparameter values, which significantly affect the learning outcome of a DNN model and lead to improved forecast accuracy.

Bibliographic Details
Main Authors: Abdul Halim, Syafrina; Abd Rahman, Nur Haizum; Xiang, Kelly Pang Li
Format: Article (peer reviewed)
Language: English
Published: UPM Press, 2024
Published in: Menemui Matematik (Discovering Mathematics), 46 (1), pp. 47-73. ISSN 2231-7023. https://myjms.mohe.gov.my/index.php/dismath/
License: CC BY-NC 4.0
Repository: UPM Institutional Repository, Universiti Putra Malaysia (record upm-120371)
Online Access: http://psasir.upm.edu.my/id/eprint/120371/
http://psasir.upm.edu.my/id/eprint/120371/1/120371.pdf
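
The abstract describes a simple experimental design: a grid search over five epoch settings (1, 10, 20, 50, 100) and two batch sizes (32, 64), with each trained network scored by MSE, MAE and MAPE. The sketch below illustrates that design in Python with Keras. It is not the authors' code: the simulated series, the network width and depth, and the train/test split are assumptions made purely for illustration.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

rng = np.random.default_rng(42)

# Simulated univariate series (assumption: AR(1) signal around a mean of 10)
n = 500
series = np.empty(n)
series[0] = 10.0
for t in range(1, n):
    series[t] = 10.0 + 0.8 * (series[t - 1] - 10.0) + rng.normal(scale=0.1)

# Turn the series into supervised samples: 5 lagged values -> next value
lags = 5
X = np.array([series[t - lags:t] for t in range(lags, n)])
y = series[lags:]
split = int(0.8 * len(X))
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

def build_dann():
    """Small multi-hidden-layer network; the architecture is an assumption."""
    model = keras.Sequential([
        keras.Input(shape=(lags,)),
        layers.Dense(32, activation="relu"),
        layers.Dense(16, activation="relu"),
        layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# The two hyperparameters tuned in the study
epoch_grid = [1, 10, 20, 50, 100]
batch_grid = [32, 64]

results = []
for epochs in epoch_grid:
    for batch_size in batch_grid:
        model = build_dann()
        model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size, verbose=0)
        pred = model.predict(X_test, verbose=0).ravel()
        mse = float(np.mean((y_test - pred) ** 2))
        mae = float(np.mean(np.abs(y_test - pred)))
        mape = float(np.mean(np.abs((y_test - pred) / y_test)) * 100)
        results.append((epochs, batch_size, mse, mae, mape))
        print(f"epochs={epochs:3d} batch={batch_size:2d}  "
              f"MSE={mse:.4f}  MAE={mae:.4f}  MAPE={mape:.2f}%")

# Select the combination with the lowest test MSE
best = min(results, key=lambda r: r[2])
print("best (epochs, batch_size):", best[:2])
```

Here the pair with the lowest test MSE is selected, mirroring the study's primary criterion; in practice MAE would be compared as well, and MAPE when contrasting series measured at different frequencies, as the abstract describes.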