Abstract
Building on the Hestenes-Stiefel (HS) and Dai-Yuan (DY) methods and combining the advantages of both, this paper proposes a new hybrid conjugate gradient method for unconstrained optimization. Global convergence of the algorithm is proved under the Wolfe line search, without imposing a descent condition. Numerical experiments show that the new algorithm is more efficient than the HS and PR methods.
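For context, and in standard notation not spelled out in this abstract (gradient g_k, search direction d_{k-1}, and y_{k-1} = g_k - g_{k-1}), the two update parameters being hybridized are:

\[
\beta_k^{HS} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}, \qquad
\beta_k^{DY} = \frac{\|g_k\|^{2}}{d_{k-1}^{\top} y_{k-1}}, \qquad
d_k = -g_k + \beta_k d_{k-1}.
\]

One common way to blend the two, given here only as an illustrative sketch and not necessarily the rule adopted in this paper, is \(\beta_k = \max\{0, \min\{\beta_k^{HS}, \beta_k^{DY}\}\}\).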
Source
Mathematica Numerica Sinica (《计算数学》)
Indexed in: CSCD; Peking University Core Journals (北大核心)
2005, No. 4, pp. 429-436 (8 pages)
Funding
Supported by the National Natural Science Foundation of China (No. 60472071) and the Scientific Research Foundation of the Beijing Municipal Education Commission (No. KM200310028117).
Keywords
Unconstrained optimization
Conjugate gradient method
Wolfe line search
Global convergence