

An Adaptive Algorithm on Effective Step Size Constraints
Abstract: Because the convergence of the Adam algorithm degrades in the later stages of iteration when the effective step size grows too large, this study proposes an optimization algorithm named MAXGrad. MAXGrad limits the growth of the effective step size by modifying the iterative formula for the second-order moment. To evaluate the practical applicability and performance of MAXGrad in depth, this paper extends the experimental scope to three larger-scale datasets and compares the algorithm in detail with SGDM, Adam, and AMSGrad. The experimental results clearly show that MAXGrad achieves significant performance improvements over adaptive algorithms such as Adam and AMSGrad on multiple datasets. These results validate the feasibility and strong performance of MAXGrad as a new effective-step-size iterative algorithm.
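The page does not reproduce MAXGrad's actual update rule, so the sketch below is only an illustration of the general idea the abstract describes: bounding the effective step size through the second-moment recursion. It shows the AMSGrad-style modification of Adam, in which the second moment is kept monotonically non-decreasing so that the effective step size lr / (sqrt(v_hat) + eps) can only shrink over time. The function name `capped_adam_step` and all hyperparameter defaults are assumptions for illustration, not the paper's algorithm.

```python
import math

def capped_adam_step(param, grad, m, v, v_hat, t,
                     lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One scalar step of an Adam variant with a monotone cap on the
    second moment (illustrative only; NOT the paper's MAXGrad rule).

    Because v_hat never decreases, the effective step size
    lr / (sqrt(v_hat) + eps) is non-increasing over the iterations,
    which is the kind of constraint the abstract describes.
    """
    m = beta1 * m + (1 - beta1) * grad           # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad * grad    # second moment
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected first moment
    v_corr = v / (1 - beta2 ** t)                # bias-corrected second moment
    v_hat = max(v_hat, v_corr)                   # cap: v_hat never falls
    param -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v, v_hat
```

Taking the element-wise maximum rather than the plain exponential average is what prevents the denominator from shrinking, and hence the effective step size from growing, late in training; the abstract suggests MAXGrad pursues the same goal with a different modification of the second-moment formula.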
Source: Advances in Applied Mathematics (《应用数学进展》), 2023, Issue 10, pp. 4248-4254 (7 pages).