Effect of Gradient Descent Optimizers and Dropout Technique on Deep Learning LSTM Performance in Rainfall-runoff Modeling

Duong Tran Anh, Dat Vi Thanh, Hoang Minh Le, Bang Tran Sy, Ahad Hasan Tanim, Quoc Bao Pham, Thanh Duc Dang, Son T. Mai, Nguyen Mai Dang

Research output: Contribution to journal › Article › peer-review

13 Scopus citations

Abstract

Machine learning and deep learning (ML-DL) based models are widely used for rainfall-runoff prediction, and they have the potential to substitute process-oriented, physics-based numerical models. However, developing an ML model also carries performance uncertainty arising from inappropriate choices of hyperparameters and neural network architectures. This study therefore searches for the best optimization algorithms for ML-DL models, namely the RMSprop, Adagrad, Adadelta, and Adam optimizers, as well as dropout techniques to be integrated into the Long Short-Term Memory (LSTM) model to improve the forecasting accuracy of rainfall-runoff modeling. Deep learning LSTM models were developed using 480 model architectures at two hydro-meteorological stations of the Mekong Delta, Vietnam, namely Chau Doc and Can Tho. Model performance was tested with the most suitable LSTM optimizers combined with four dropout percentages: 0%, 10%, 20%, and 30%. The Adagrad optimizer showed the best model performance in testing. Deep learning LSTM models with 10% dropout produced the best predictions while significantly reducing the overfitting tendency of the forecasted time series. The findings of this study are valuable for setting up ML-based hydrological models by identifying a suitable gradient descent (GD) optimizer and an optimal dropout ratio to enhance model performance and forecasting accuracy.
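As a hedged illustration of the optimizer-and-dropout sweep the abstract describes (not the authors' code), the sketch below builds a Keras-style LSTM for one-step runoff forecasting with a selectable gradient descent optimizer and dropout ratio. The layer size, lookback window, and synthetic data are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: an LSTM rainfall-runoff model with a configurable
# optimizer (RMSprop, Adagrad, Adadelta, Adam) and dropout ratio,
# mirroring the kind of architecture sweep described in the abstract.
# Layer sizes, lookback window, and data are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_lstm(n_lookback: int, n_features: int,
               optimizer_name: str = "adagrad",
               dropout_rate: float = 0.1) -> keras.Model:
    """Build a one-step-ahead runoff model with the given optimizer/dropout."""
    optimizers = {
        "rmsprop": keras.optimizers.RMSprop(),
        "adagrad": keras.optimizers.Adagrad(),
        "adadelta": keras.optimizers.Adadelta(),
        "adam": keras.optimizers.Adam(),
    }
    model = keras.Sequential([
        layers.Input(shape=(n_lookback, n_features)),
        layers.LSTM(64),
        layers.Dropout(dropout_rate),   # 0%, 10%, 20%, or 30% in the study
        layers.Dense(1),                # next-day runoff
    ])
    model.compile(optimizer=optimizers[optimizer_name], loss="mse")
    return model

if __name__ == "__main__":
    # Synthetic stand-in data: 500 samples, 7-day lookback, rainfall + runoff.
    X = np.random.rand(500, 7, 2)
    y = np.random.rand(500, 1)
    for opt in ["rmsprop", "adagrad", "adadelta", "adam"]:
        for rate in [0.0, 0.1, 0.2, 0.3]:
            model = build_lstm(7, 2, optimizer_name=opt, dropout_rate=rate)
            hist = model.fit(X, y, epochs=2, batch_size=32, verbose=0)
            print(opt, rate, hist.history["loss"][-1])
```

In practice, each optimizer/dropout combination would be trained on the station's rainfall-runoff series and compared on held-out data, which is how a sweep like the paper's 480-architecture search would identify the best-performing configuration.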

Original language: English
Pages (from-to): 639-657
Number of pages: 19
Journal: Water Resources Management
Volume: 37
Issue number: 2
DOIs
State: Published - Jan 2023
Externally published: Yes

Funding

The first author acknowledges the financial support from the Fulbright Visiting Scholar program at the University of South Florida, USA. We also thank the Southern Regional Hydro-meteorological Center and the National Meteorological Center for providing the daily rainfall and runoff data used in this study.

Keywords

  • Dropout technique
  • LSTM
  • Mekong delta
  • Optimizers
  • Rainfall-runoff
