Daily Rainfall Forecasting Using Meteorology Data with Long Short-Term Memory (LSTM) Network
Authors: Soo See Chai 1, Kok Luong Goh 2
1 - Faculty of Computer Science and Information Technology, University of Malaysia Sarawak (UNIMAS), 94300, Kota Samarahan, Sarawak, Malaysia
2 - International College of Advanced Technology Sarawak (i-CATS), Kuching, Sarawak, Malaysia
Abstract:
Rainfall is a natural climatic phenomenon and the prediction of its value is crucial for weather forecasting. For time series forecasting, the Long Short-Term Memory (LSTM) network has been shown to be superior to other machine learning algorithms. Therefore, in this research work, an LSTM network is developed to predict daily average rainfall using meteorological data obtained from the Malaysian Meteorological Department for Kuching, Sarawak, Malaysia. Six daily meteorological variables, namely minimum temperature (°C), maximum temperature (°C), mean temperature (°C), mean wind speed (m/s), mean sea level pressure (hPa) and mean relative humidity (%), from 2009 to 2013 were used as the input of the LSTM prediction model. The accuracy of the predicted daily average rainfall was assessed using the coefficient of determination (R²) and the Root Mean Square Error (RMSE). Contrary to the common practice of dividing the whole available data set into training, validation and testing subsets, the LSTM model developed in this study was applied to forecast the daily average rainfall for December 2013, while training was done using the data prior to that month. An analysis of the testing data showed that the data are more spread out in the testing set than in the training data. As LSTM requires the right setting of hyper-parameters, the effects of the maximum number of epochs and the mini-batch size on the rainfall prediction accuracy were also analysed in this study. From the experiments, a five-layer LSTM model with a maximum of 10 epochs and a mini-batch size of 100 achieved the best prediction, with an average RMSE of 20.67 mm and R² = 0.82.
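To make the described setup concrete, the following is a minimal sketch (not the authors' implementation) of such an LSTM rainfall regressor, assuming a Python/Keras environment. Only the six input variables, the 10-epoch maximum, the mini-batch size of 100 and the RMSE/R² metrics come from the abstract; the 7-day input window, the layer widths, the Adam optimiser and the synthetic placeholder data are illustrative assumptions.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

N_FEATURES = 6   # min/max/mean temperature, wind speed, sea-level pressure, relative humidity
TIMESTEPS = 7    # assumed 7-day input window; the abstract does not specify the window length

def build_lstm_model(timesteps: int = TIMESTEPS, n_features: int = N_FEATURES) -> keras.Model:
    """Stacked LSTM regressor for daily average rainfall (mm); the exact five-layer layout is assumed."""
    model = keras.Sequential([
        layers.Input(shape=(timesteps, n_features)),
        layers.LSTM(64, return_sequences=True),   # first recurrent layer
        layers.LSTM(32),                          # second recurrent layer
        layers.Dense(16, activation="relu"),
        layers.Dense(1),                          # predicted daily average rainfall (mm)
    ])
    model.compile(optimizer="adam", loss="mse")   # Adam is an assumed choice, not stated in the abstract
    return model

# Placeholder arrays with the right shapes; in the study these would be the 2009-2013
# MetMalaysia records, trained on everything before December 2013 and tested on that month.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(1500, TIMESTEPS, N_FEATURES)), rng.gamma(2.0, 5.0, 1500)
X_test, y_test = rng.normal(size=(31, TIMESTEPS, N_FEATURES)), rng.gamma(2.0, 5.0, 31)

model = build_lstm_model()
model.fit(X_train, y_train, epochs=10, batch_size=100, verbose=0)  # the abstract's best setting

# Evaluate with the paper's two metrics: RMSE and the coefficient of determination R².
y_pred = model.predict(X_test, verbose=0).ravel()
rmse = float(np.sqrt(np.mean((y_test - y_pred) ** 2)))
r2 = 1.0 - float(np.sum((y_test - y_pred) ** 2) / np.sum((y_test - np.mean(y_test)) ** 2))
print(f"RMSE = {rmse:.2f} mm, R2 = {r2:.2f}")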