Optimising LSTM and BiLSTM models for time series forecasting through hyperparameter tuning


Bibliographic Details
Main Authors: Nur Haizum, Abd Rahman; Yin, Quay Pin; Hani Syahida, Zulkafli
Format: Article
Language: English
Published: Universiti Kebangsaan Malaysia, 2025
Subjects: HA Statistics; QA Mathematics
Online Access: https://umpir.ump.edu.my/id/eprint/45978/
Description: Long Short-Term Memory (LSTM) and Bidirectional Long Short-Term Memory (BiLSTM) networks are emerging Recurrent Neural Network (RNN) architectures widely used in time series forecasting. The performance of these networks depends on the choice of hyperparameters, and a random selection of hyperparameters may increase the forecasting error. Hence, this study aims to optimise the performance of LSTM and BiLSTM in time series forecasting by tuning one of the essential hyperparameters: the number of hidden neurons. LSTM and BiLSTM models with 32, 64, and 128 hidden neurons, combined with various settings of the other hyperparameters, were built through grid search. The models were evaluated and compared using the Mean Squared Error (MSE) and Mean Absolute Error (MAE). Analysis of real data revealed that 128 hidden neurons is the optimal choice, yielding the lowest error values. The study also investigates whether BiLSTM performs better than LSTM in forecasting: the two networks were compared on time series data, and the Wilcoxon signed-rank test was conducted. The results revealed a significant difference between the two networks, with BiLSTM outperforming LSTM in forecasting time series data. Hence, BiLSTM with 128 hidden neurons is recommended over LSTM for time series forecasting. These findings have implications for practice: the combination of model and hyperparameters should be chosen carefully to obtain more accurate predictions in time series forecasting.
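The grid-search procedure described above can be sketched in miniature. The paper's actual LSTM/BiLSTM code is not part of this record, so a simple moving-average forecaster stands in for the neural models; its window size plays the role of the number of hidden neurons, and all names here are illustrative.

```python
def mse(actual, forecast):
    """Mean Squared Error over paired actual/forecast values."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

def mae(actual, forecast):
    """Mean Absolute Error over paired actual/forecast values."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def moving_average_forecast(series, window):
    """One-step-ahead forecasts: predict the mean of the previous `window` points."""
    return [sum(series[t - window:t]) / window for t in range(window, len(series))]

def grid_search(series, candidate_windows):
    """Pick the candidate with the lowest MSE, mirroring the paper's
    selection among 32, 64, and 128 hidden neurons."""
    best_setting, best_scores = None, None
    for w in candidate_windows:
        forecasts = moving_average_forecast(series, w)
        actual = series[w:]  # align targets with the forecast horizon
        scores = (mse(actual, forecasts), mae(actual, forecasts))
        if best_scores is None or scores[0] < best_scores[0]:
            best_setting, best_scores = w, scores
    return best_setting, best_scores
```

On a trending series the smallest window wins, since the moving average lags the trend least. In the study the grid would instead span the hidden-neuron counts together with the other hyperparameter combinations mentioned in the abstract.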
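The model comparison rests on the Wilcoxon signed-rank test applied to paired forecast errors. A stdlib-only sketch using the normal approximation follows; the record does not state which implementation the authors used (`scipy.stats.wilcoxon` would be the usual choice), so this is an assumption for illustration.

```python
import math

def wilcoxon_signed_rank(x, y):
    """Two-sided Wilcoxon signed-rank test (normal approximation).

    x and y are paired samples, e.g. per-series errors of two models.
    Returns (W, p) where W = min(W+, W-). Zero differences are dropped.
    """
    diffs = [a - b for a, b in zip(x, y) if a != b]
    n = len(diffs)
    if n == 0:
        raise ValueError("all paired differences are zero")
    # Rank the absolute differences, averaging ranks across ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_plus, w_minus)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w - mean) / sd
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return w, p
```

With, say, ten paired error values where one model is uniformly lower, W = 0 and p falls below 0.05, which is the style of conclusion the abstract reports for BiLSTM versus LSTM.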
Repository: UMP Institutional Repository, Universiti Malaysia Pahang
Citation: Nur Haizum, Abd Rahman; Yin, Quay Pin; Hani Syahida, Zulkafli (2025) Optimising LSTM and BiLSTM models for time series forecasting through hyperparameter tuning. Journal of Quality Measurement and Analysis, 21 (3), pp. 191-205. ISSN 1823-5670. (Peer-reviewed; published)
DOI: https://doi.org/10.17576/jqma.2103.2025.12
Full text (PDF): https://umpir.ump.edu.my/id/eprint/45978/1/2025%20Optimising%20LSTM%20And%20BILSTM%20Models%20For%20Time%20Series%20Forecasting%20Through%20Hyperparameter%20Tuning%20%20.pdf