Research on Mining Maximum Subsidence Prediction Based on Genetic Algorithm Combined with XGBoost Model

Abstract

The extreme gradient boosting (XGBoost) ensemble learning algorithm excels at solving complex nonlinear regression problems. To accurately predict the surface subsidence caused by mining, this work introduces a combined genetic algorithm (GA) and XGBoost model for mining subsidence prediction, implemented in Python. The XGBoost hyperparameter vector is optimized by the genetic algorithm to improve the prediction accuracy and reliability of the XGBoost model. The model was evaluated on several domestic mining subsidence data sets; the GA-XGBoost predictions achieve an R2 (coefficient of determination) of 0.941, an RMSE (root mean square error) of 0.369, and an MAE (mean absolute error) of 0.308. Compared with classical ensemble learning models such as XGBoost, random forest, and gradient boosting, the GA-XGBoost model delivers higher prediction accuracy and better overall performance than any single machine learning model.
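
To make the approach concrete, the sketch below illustrates the GA-XGBoost idea described in the abstract: a small genetic algorithm (tournament selection, uniform crossover, random-reset mutation, elitism) searches a four-gene XGBoost hyperparameter vector, each candidate is scored by cross-validated RMSE, and the final model is assessed with the same R2/RMSE/MAE metrics. The libraries used (xgboost, scikit-learn), the synthetic regression data, and all GA settings are illustrative assumptions; the paper's actual implementation, search ranges, and subsidence data sets are not reproduced here.

```python
import random

import numpy as np
from sklearn.datasets import make_regression
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import cross_val_score, train_test_split
from xgboost import XGBRegressor

# Placeholder data standing in for the mining subsidence observations,
# which are not available here.
X, y = make_regression(n_samples=300, n_features=6, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Hyperparameter "genome": (n_estimators, max_depth, learning_rate, subsample).
BOUNDS = [(50, 500), (2, 10), (0.01, 0.3), (0.5, 1.0)]


def decode(ind):
    return dict(n_estimators=int(ind[0]), max_depth=int(ind[1]),
                learning_rate=float(ind[2]), subsample=float(ind[3]))


def fitness(ind):
    # Negative cross-validated RMSE, so a larger fitness is better.
    model = XGBRegressor(**decode(ind), objective="reg:squarederror", verbosity=0)
    return cross_val_score(model, X_train, y_train, cv=3,
                           scoring="neg_root_mean_squared_error").mean()


def crossover(a, b):
    # Uniform crossover: each gene is taken from one of the two parents.
    return [random.choice(pair) for pair in zip(a, b)]


def mutate(ind, rate=0.2):
    # Reset each gene to a random value within its bounds with probability `rate`.
    return [random.uniform(lo, hi) if random.random() < rate else g
            for g, (lo, hi) in zip(ind, BOUNDS)]


def tournament(pop, fits, k=3):
    # Pick k individuals at random and return the fittest of them.
    picks = random.sample(range(len(pop)), k)
    return pop[max(picks, key=lambda i: fits[i])]


random.seed(0)
pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(12)]
for _ in range(10):  # deliberately small GA budget, for illustration only
    fits = [fitness(ind) for ind in pop]
    elite = pop[int(np.argmax(fits))]  # keep the best individual (elitism)
    pop = [elite] + [mutate(crossover(tournament(pop, fits), tournament(pop, fits)))
                     for _ in range(len(pop) - 1)]

# Refit the best configuration found and report the metrics quoted in the abstract.
fits = [fitness(ind) for ind in pop]
best = decode(pop[int(np.argmax(fits))])
model = XGBRegressor(**best, objective="reg:squarederror", verbosity=0).fit(X_train, y_train)
pred = model.predict(X_test)
print("best params:", best)
print("R2  :", r2_score(y_test, pred))
print("RMSE:", np.sqrt(mean_squared_error(y_test, pred)))
print("MAE :", mean_absolute_error(y_test, pred))
```

In a real application the hyperparameter bounds, population size, and number of generations would be tuned for the actual subsidence data rather than the toy settings used above.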

Citation (APA)

Gu, Z., Cao, M., Wang, C., Yu, N., & Qing, H. (2022). Research on Mining Maximum Subsidence Prediction Based on Genetic Algorithm Combined with XGBoost Model. Sustainability (Switzerland), 14(16). https://doi.org/10.3390/su141610421
