Summary: Long Short-Term Memory (LSTM) and Bidirectional Long Short-Term Memory (BiLSTM) are recurrent neural network (RNN) architectures widely used in time series forecasting. The performance of these networks depends on the choice of hyperparameters, and a poorly chosen configuration can increase forecasting error. Hence, this study aims to optimise the performance of LSTM and BiLSTM in time series forecasting by tuning one of the essential hyperparameters: the number of hidden neurons. LSTM and BiLSTM models with 32, 64, and 128 hidden neurons, combined with various settings of the other hyperparameters, are built through grid search. The models are evaluated and compared on Mean Squared Error (MSE) and Mean Absolute Error (MAE). Analysis of real data revealed that 128 hidden neurons is the optimum choice, yielding the lowest error values. This study also investigates whether BiLSTM performs better than LSTM in forecasting: the two networks were compared on time series data, and the Wilcoxon signed-rank test was conducted. The results revealed a significant difference between the two networks, with BiLSTM outperforming LSTM in forecasting time series data. Hence, BiLSTM with 128 hidden neurons is recommended over LSTM for time series forecasting. Since these findings have implications for future practice, the combination of model and hyperparameters should be chosen carefully to obtain more accurate predictions in time series forecasting.
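The comparison procedure described above can be sketched in a few lines: select the hidden-neuron count with the lowest error, then apply the Wilcoxon signed-rank test to paired LSTM and BiLSTM errors. This is a minimal illustration, not the study's implementation; all error values below are hypothetical placeholders standing in for the MSE scores that the trained networks would produce.

```python
# Sketch of the model-selection and comparison steps: pick the
# hidden-neuron setting with the lowest mean MSE, then test whether
# paired LSTM vs. BiLSTM errors differ significantly.
# All numbers are hypothetical; in the study they come from trained models.
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical test-set MSE for each hidden-neuron setting.
mse = {
    32:  np.array([0.26, 0.24, 0.27, 0.25, 0.23]),
    64:  np.array([0.19, 0.21, 0.18, 0.20, 0.22]),
    128: np.array([0.13, 0.15, 0.12, 0.14, 0.16]),
}
best = min(mse, key=lambda k: mse[k].mean())
print(f"lowest mean MSE with {best} hidden neurons")

# Hypothetical paired errors of the two architectures on the same
# ten series; BiLSTM is consistently slightly lower here.
lstm_mse   = np.array([0.16, 0.18, 0.15, 0.17, 0.19,
                       0.16, 0.18, 0.17, 0.20, 0.15])
bilstm_mse = np.array([0.13, 0.15, 0.13, 0.14, 0.16,
                       0.14, 0.15, 0.14, 0.17, 0.13])

# Wilcoxon signed-rank test on the paired differences; a small
# p-value indicates a significant performance difference.
stat, p = wilcoxon(lstm_mse, bilstm_mse)
print(f"Wilcoxon statistic={stat:.1f}, p-value={p:.4f}")
```

Because the test is applied to paired differences rather than raw scores, it matches the abstract's setup of comparing the two networks on the same forecasting tasks without assuming the errors are normally distributed.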
|