Abstract
To predict the annual precipitation of Anhui Province accurately and reliably, regional annual precipitation prediction models were built using signal-analysis techniques and machine-learning methods, based on the homogenized precipitation dataset of Anhui Province for 1900-2009. Morlet wavelet analysis and ensemble empirical mode decomposition (EEMD) show that the historical annual precipitation series contains periodicities of roughly 3, 5, and 20 years. To improve model accuracy, five machine-learning models with a 3-node input layer and a 1-node output layer were built, namely BPNN, WANN, TSNN, SVM, and ELM. Following a 4:1 split, the first 85 groups of the prepared sample set were used as the training (calibration) set and the last 22 groups as the test (validation) set. The results show that the five models perform well: the mean relative errors in the calibration period are 6.1%, 12.1%, 14.3%, 14.3%, and 13.2%, respectively, and in the validation period 20.6%, 13.6%, 12.5%, 13.0%, and 14.3%, with qualification rates of 63.7%, 72.7%, 77.3%, 77.3%, and 72.7%, respectively. Overall, all models except BPNN are satisfactory, indicating that machine-learning methods are applicable and reliable for simulating and predicting nonlinear hydrological series. The findings can guide future water resources planning and allocation in Anhui Province.
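To make the workflow described in the abstract concrete, a minimal Python sketch follows. It is not the authors' code: the precipitation series is a random placeholder, scikit-learn's SVR stands in for the paper's SVM, and the 20% relative-error band defining a "qualified" forecast is an assumption (a common criterion in Chinese hydrological forecasting, not stated in the abstract). EEMD comes from the PyEMD (EMD-signal) package and the Morlet wavelet from PyWavelets.

import numpy as np
import pywt                      # PyWavelets, for the Morlet CWT
from PyEMD import EEMD           # EMD-signal package
from sklearn.svm import SVR

# Placeholder for the 110 homogenized annual totals (1900-2009).
rng = np.random.default_rng(0)
annual_p = 1100.0 + 200.0 * rng.standard_normal(110)

# --- Periodicity analysis (Morlet wavelet + EEMD) ------------------
# EEMD splits the series into intrinsic mode functions whose mean
# periods hint at the dominant cycles (~3, 5, 20 years in the paper).
imfs = EEMD(trials=100).eemd(annual_p)

# Morlet continuous wavelet transform over scales of 1-32 years.
coef, _ = pywt.cwt(annual_p, np.arange(1, 33), "morl")
power = np.abs(coef) ** 2        # wavelet power spectrum

# --- Prediction model (3-node input, 1-node output) ----------------
# Three lagged years as inputs, the next year as output; 110 years
# yield 107 samples, split 85/22 (about 4:1) as in the abstract.
lags = 3
X = np.array([annual_p[i:i + lags] for i in range(len(annual_p) - lags)])
y = annual_p[lags:]
X_tr, X_te, y_tr, y_te = X[:85], X[85:], y[:85], y[85:]

pred = SVR(kernel="rbf", C=100.0).fit(X_tr, y_tr).predict(X_te)

rel_err = np.abs(pred - y_te) / np.abs(y_te)
print(f"mean relative error: {100 * rel_err.mean():.1f}%")
# Qualified rate: share of forecasts within the assumed 20% band.
print(f"qualified rate: {100 * (rel_err <= 0.20).mean():.1f}%")

Swapping SVR for another regressor (e.g., an MLP for BPNN) reproduces the paper's multi-model comparison in spirit.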
Authors
DU Yi, LONG Kai-hao, WANG Da-yang, WANG Da-gang (School of Geography and Planning, Sun Yat-sen University, Guangzhou 510275, China)
Source
Water Resources and Power (《水电能源科学》)
Peking University Core Journal (北大核心)
2020, No. 7, pp. 5-7 and 41 (4 pages in total)
Funding
National Natural Science Foundation of China (Grant No. 51779278).
Keywords
precipitation
prediction
wavelet analysis
ensemble empirical mode decomposition
machine learning