Journal Articles
361 articles found
1. GLOBAL CONVERGENCE RESULTS OF A THREE TERM MEMORY GRADIENT METHOD WITH A NON-MONOTONE LINE SEARCH TECHNIQUE (Cited by: 12)
Authors: 孙清滢. Acta Mathematica Scientia (SCIE, CSCD), 2005, No. 1, pp. 170-178.
In this paper, a new class of three-term memory gradient methods with a non-monotone line search technique for unconstrained optimization is presented. Global convergence properties of the new methods are discussed. Combining the quasi-Newton method with the new method, the former is modified to have the global convergence property. Numerical results show that the new algorithm is efficient.
Keywords: nonlinear programming; three-term memory gradient method; convergence; non-monotone line search technique; numerical experiment
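The nonmonotone acceptance rule central to methods of this kind (in the spirit of Grippo, Lampariello and Lucidi) compares the trial point against the maximum of the last M function values rather than only the most recent one. A minimal steepest-descent sketch in Python; the test function, M, and tolerances are illustrative choices, not taken from the paper:

```python
def nonmonotone_gradient_descent(f, grad, x0, M=5, delta=1e-4, max_iter=500):
    """Steepest descent with a Grippo-style nonmonotone Armijo rule:
    accept alpha once f(x + alpha*d) <= max(last M f-values) + delta*alpha*g.d."""
    x = list(x0)
    history = [f(x)]                      # recent function values
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) < 1e-16:
            break
        d = [-gi for gi in g]             # steepest descent direction
        gd = sum(gi * di for gi, di in zip(g, d))
        ref = max(history[-M:])           # nonmonotone reference value
        alpha = 1.0
        while f([xi + alpha * di for xi, di in zip(x, d)]) > ref + delta * alpha * gd:
            alpha *= 0.5                  # backtrack
        x = [xi + alpha * di for xi, di in zip(x, d)]
        history.append(f(x))
    return x

# Illustrative problem: minimize f(x, y) = x^2 + 10*y^2 from (3, -2)
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: [2 * x[0], 20 * x[1]]
xstar = nonmonotone_gradient_descent(f, grad, [3.0, -2.0])
```

Because the reference value is a running maximum, occasional increases in f are tolerated, which is what lets such methods take larger steps than a monotone Armijo rule.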
2. ON THE GLOBAL CONVERGENCE OF CONJUGATE GRADIENT METHODS WITH INEXACT LINE SEARCH
Authors: 刘光辉, 韩继业. Numerical Mathematics: A Journal of Chinese Universities (English Series) (SCIE), 1995, No. 2, pp. 147-153.
In this paper we consider the global convergence of any conjugate gradient method of the form d1 = -g1, d_{k+1} = -g_{k+1} + βk dk (k ≥ 1), with any βk satisfying some conditions and with the strong Wolfe line search conditions. Under a convexity assumption on the objective function, we prove the descent property and the global convergence of this method.
Keywords: conjugate gradient method; strong Wolfe line search; global convergence
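The recurrence quoted in the abstract, d1 = -g1 and d_{k+1} = -g_{k+1} + βk dk, can be sketched with the Fletcher-Reeves choice βk = ‖g_{k+1}‖²/‖gk‖². The bisection line search below enforces the weak Wolfe conditions (a simplification of the strong Wolfe rule analyzed in the paper); the restart safeguard and test problem are illustrative additions:

```python
def wolfe_search(f, grad, x, d, c1=1e-4, c2=0.1):
    """Bisection search for a step satisfying the (weak) Wolfe conditions."""
    lo, hi, alpha = 0.0, float("inf"), 1.0
    f0 = f(x)
    g0d = sum(gi * di for gi, di in zip(grad(x), d))
    for _ in range(60):
        xn = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xn) > f0 + c1 * alpha * g0d:
            hi = alpha                      # Armijo fails: step too long
        elif sum(gi * di for gi, di in zip(grad(xn), d)) < c2 * g0d:
            lo = alpha                      # curvature fails: step too short
        else:
            return alpha
        alpha = (lo + hi) / 2 if hi < float("inf") else 2 * lo
    return alpha

def fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=200):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                   # d1 = -g1
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) < tol * tol:
            break
        alpha = wolfe_search(f, grad, x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = sum(a * a for a in g_new) / sum(a * a for a in g)  # FR beta
        d = [-a + beta * di for a, di in zip(g_new, d)]  # d_{k+1} = -g_{k+1} + beta*d_k
        if sum(a * di for a, di in zip(g_new, d)) >= 0:
            d = [-a for a in g_new]         # restart if not a descent direction
        g = g_new
    return x

# Illustrative convex quadratic
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: [2 * x[0], 20 * x[1]]
xstar = fletcher_reeves(f, grad, [3.0, -2.0])
```

The convergence theory in papers of this line hinges on the βk condition and the Wolfe parameters; the sketch only shows the mechanics of the recurrence.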
3. GLOBAL CONVERGENCE OF THE GENERAL THREE TERM CONJUGATE GRADIENT METHODS WITH THE RELAXED STRONG WOLFE LINE SEARCH
Authors: Xu Zeshui, Yue Zhenjun (Institute of Sciences, PLA University of Science and Technology, Nanjing, 210016). Applied Mathematics (A Journal of Chinese Universities) (SCIE, CSCD), 2001, No. 1, pp. 58-62.
The global convergence of the general three-term conjugate gradient methods with the relaxed strong Wolfe line search is proved.
Keywords: conjugate gradient method; inexact line search; global convergence
4. A Scaled Conjugate Gradient Method Based on New BFGS Secant Equation with Modified Nonmonotone Line Search
Authors: Tsegay Giday Woldu, Haibin Zhang, Yemane Hailu Fissuh. American Journal of Computational Mathematics, 2020, No. 1, pp. 1-22.
In this paper, we provide and analyze a new scaled conjugate gradient method, based on the modified secant equation of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method and on a new modified nonmonotone line search technique. The method incorporates the modified BFGS secant equation in an effort to include second-order information about the objective function. The new secant equation uses both gradient and function-value information, and its update formula inherits the positive definiteness of the Hessian approximation for general convex functions. In order to improve the likelihood of finding a global optimal solution, we introduce a new modified nonmonotone line search technique. It is shown that, for nonsmooth convex problems, the proposed algorithm is globally convergent. Numerical results show that this new scaled conjugate gradient algorithm is promising and efficient for solving not only convex but also some large-scale nonsmooth nonconvex problems in the sense of the Dolan-Moré performance profiles.
Keywords: conjugate gradient method; BFGS method; modified secant equation; nonmonotone line search; nonsmooth optimization
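One widely used secant modification of this kind corrects yk = g_{k+1} - gk with a scalar θk built from function values, so that the secant pair also reflects f itself; whether this is exactly the paper's equation is an assumption. A property worth noting is that the correction vanishes on quadratics:

```python
def modified_secant_y(fk, fk1, gk, gk1, s):
    """Modified secant vector y* = y + (theta / s.s) * s, where
    theta = 6*(fk - fk1) + 3*(gk + gk1).s carries function-value information.
    For a quadratic objective theta = 0, so y* reduces to the standard y."""
    y = [a - b for a, b in zip(gk1, gk)]
    theta = 6 * (fk - fk1) + 3 * sum((a + b) * si for a, b, si in zip(gk, gk1, s))
    ss = sum(si * si for si in s)
    return [yi + (theta / ss) * si for yi, si in zip(y, s)]

# Quadratic check: f(x) = x1^2 + 2*x2^2, step from (1, 1) to (2, 3);
# the correction term vanishes and y* equals g1 - g0 = [2, 8].
res = modified_secant_y(3.0, 22.0, [2.0, 4.0], [4.0, 12.0], [1.0, 2.0])
```

In a BFGS or scaled-CG update, y* simply replaces y wherever the secant pair (s, y) appears.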
5. A New Two-Parameter Family of Nonlinear Conjugate Gradient Methods Without Line Search for Unconstrained Optimization Problems
Authors: ZHU Tiefeng. Wuhan University Journal of Natural Sciences (CAS, CSCD), 2024, No. 5, pp. 403-411.
This paper puts forward a two-parameter family of nonlinear conjugate gradient (CG) methods without line search for solving unconstrained optimization problems. The main feature of this family is that it does not rely on any line search and requires only a simple step-size formula, while always generating a sufficient descent direction. Under certain assumptions, the proposed method is proved to possess global convergence. Finally, our method is compared with other potential methods. A large number of numerical experiments show that our method is competitive and effective.
Keywords: unconstrained optimization; conjugate gradient method without line search; global convergence
6. A New Class of Nonlinear Conjugate Gradient Methods with Global Convergence Properties (Cited by: 1)
Authors: 陈忠. 《长江大学学报(自科版)(上旬)》 (CAS), 2014, No. 3, pp. I0001-I0003.
Nonlinear conjugate gradient methods, with their simple iterations and low storage requirements, and because the search direction need not satisfy a secant condition, occupy an extremely important place in solving large-scale unconstrained optimization problems. This paper proposes a new class of conjugate gradient methods whose search direction is a descent direction of the objective function. Assuming the objective function is continuously differentiable with a Lipschitz-continuous gradient and that the line search satisfies the Wolfe conditions, the global convergence of the proposed algorithm is discussed.
Keywords: abstract; editorial office; editorial work; readers
7. A New Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems (Cited by: 1)
Authors: LIU Jin-kui, WANG Kai-rong, SONG Xiao-qian, DU Xiang-lin. Chinese Quarterly Journal of Mathematics (CSCD), 2010, No. 3, pp. 444-450.
In this paper, an efficient conjugate gradient method is given for general unconstrained optimization problems; it guarantees the sufficient descent property and global convergence under the strong Wolfe line search conditions. Numerical results show that the new method is efficient and stable in comparison with the PRP+ method, so it can be widely used in scientific computation.
Keywords: unconstrained optimization; conjugate gradient method; strong Wolfe line search; sufficient descent property; global convergence
8. CONVERGENCE ANALYSIS ON A CLASS OF CONJUGATE GRADIENT METHODS WITHOUT SUFFICIENT DECREASE CONDITION (Cited by: 1)
Authors: 刘光辉, 韩继业, 戚厚铎, 徐中玲. Acta Mathematica Scientia (SCIE, CSCD), 1998, No. 1, pp. 11-16.
Recently, Gilbert and Nocedal [3] investigated the global convergence of conjugate gradient methods related to the Polak-Ribière formula; they restricted βk to non-negative values. [5] discussed the same problem as [3] and allowed βk to be negative when the objective function is convex. This paper allows βk to be selected in a wider range than [5]. In particular, the global convergence of the corresponding algorithm without the sufficient decrease condition is proved.
Keywords: Polak-Ribière conjugate gradient method; strong Wolfe line search; global convergence
9. A CLASS OF NONMONOTONE CONJUGATE GRADIENT METHODS FOR NONCONVEX FUNCTIONS
Authors: Liu Yun, Wei Zengxin. Applied Mathematics (A Journal of Chinese Universities) (SCIE, CSCD), 2002, No. 2, pp. 208-214.
This paper discusses the global convergence of a class of nonmonotone conjugate gradient methods (NM methods) for nonconvex objective functions. This class of methods includes the nonmonotone counterparts of the modified Polak-Ribière method and the modified Hestenes-Stiefel method as special cases.
Keywords: nonmonotone conjugate gradient method; nonmonotone line search; global convergence; unconstrained optimization
10. A Hybrid Conjugate Gradient Method for Optimization Problems
Authors: Xiangrong Li, Xupei Zhao. Natural Science, 2011, No. 1, pp. 85-90.
A hybrid method combining the Polak-Ribière-Polyak (PRP) method and the Wei-Yao-Liu (WYL) method is proposed for unconstrained optimization problems. It possesses the following properties: i) the method inherits an important property of the well-known PRP method, the tendency to turn towards the steepest descent direction if a small step is generated away from the solution, preventing a sequence of tiny steps from happening; ii) non-negativity of the scalar βk holds automatically; iii) global convergence under some line search rule is established for nonconvex functions. Numerical results show that the method is effective for the test problems.
Keywords: line search; unconstrained optimization; conjugate gradient method; global convergence
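The WYL scalar behind property ii) is usually written βWYL = g_{k+1}ᵀ(g_{k+1} − (‖g_{k+1}‖/‖gk‖) gk)/‖gk‖², which is nonnegative by the Cauchy-Schwarz inequality; the hybrid's exact switching rule is not reproduced here. A small sketch of just the scalar:

```python
import math

def beta_wyl(g_new, g_old):
    """WYL conjugacy scalar. Since g_new.g_old <= |g_new| * |g_old|
    (Cauchy-Schwarz), the returned value is always >= 0."""
    nn = math.sqrt(sum(a * a for a in g_new))   # |g_{k+1}|
    no = math.sqrt(sum(a * a for a in g_old))   # |g_k|
    dot = sum(a * b for a, b in zip(g_new, g_old))
    return (nn * nn - (nn / no) * dot) / (no * no)
```

The automatic bound 0 ≤ βWYL is what makes this family attractive for convergence proofs, since negative β values are a known obstruction for PRP-type analyses.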
11. A Descent Gradient Method and Its Global Convergence
Authors: LIU Jin-kui. Chinese Quarterly Journal of Mathematics (CSCD), 2014, No. 1, pp. 142-150.
Y. Liu and C. Storey (1992) proposed the famous LS conjugate gradient method, which has good numerical results. However, the LS method has very weak convergence under Wolfe-type line searches. In this paper, we give a new descent gradient method based on the LS method. It guarantees the sufficient descent property at each iteration and global convergence under the strong Wolfe line search. Finally, we present extensive preliminary numerical experiments showing the efficiency of the proposed method in comparison with the famous PRP+ method.
Keywords: unconstrained optimization; conjugate gradient method; strong Wolfe line search; sufficient descent property; global convergence
12. An Adaptive Two-Parameter Conjugate Gradient Method Approximating BFGS
Authors: 李向利, 莫元健, 梅建平. 《应用数学》 (北大核心), 2024, No. 1, pp. 89-99.
To solve large-scale unconstrained optimization problems more effectively, this paper proposes an adaptive two-parameter conjugate gradient method based on the self-scaling memoryless BFGS quasi-Newton method. The designed search direction satisfies the sufficient descent property. Under general assumptions and the standard Wolfe line search criterion, the method is proved to be globally convergent. Numerical results show that the new algorithm is effective.
Keywords: large-scale unconstrained optimization; conjugate gradient method; Wolfe line search; global convergence
13. Global Convergence of a WYL-Type Spectral Conjugate Gradient Method (Cited by: 2)
Authors: 蔡宇, 周光辉. 《数学物理学报(A辑)》 (CSCD, 北大核心), 2024, No. 1, pp. 173-184.
To solve large-scale unconstrained optimization problems, this paper combines the WYL conjugate gradient method with the spectral conjugate gradient method and presents a WYL-type spectral conjugate gradient method. Independently of any line search, the search directions generated by the method all satisfy the sufficient descent property, and global convergence is proved under the strong Wolfe line search. Compared with the convergence analysis of the WYL conjugate gradient method, the WYL-type spectral method widens the admissible range of the line search parameter σ. Finally, numerical results show that the method is effective.
Keywords: unconstrained optimization; spectral conjugate gradient method; strong Wolfe line search; global convergence
14. A Nonmonotone Projected Gradient Method for Multiobjective Problems on Convex Sets
Authors: Gabriel Anibal Carrizo, Nadia Soledad Fazzio, Maria Laura Schuverdt. Journal of the Operations Research Society of China (EI, CSCD), 2024, No. 2, pp. 410-427.
In this work we consider an extension of the classical scalar-valued projected gradient method to multiobjective problems on convex sets. As in Fazzio et al. (Optim Lett 13:1365-1379, 2019), a parameter which controls the step length is considered, and an updating rule based on the spectral gradient method from the scalar case is proposed. In the present paper, we consider an extension of the traditional nonmonotone approach of Grippo et al. (SIAM J Numer Anal 23:707-716, 1986), based on the maximum of some previous function values, as suggested in Mita et al. (J Glob Optim 75:539-559, 2019) for unconstrained multiobjective optimization problems. We prove that the accumulation points of sequences generated by the proposed algorithm, if they exist, are stationary points of the original problem. Numerical experiments are reported.
Keywords: multiobjective optimization; projected gradient methods; nonmonotone line search; global convergence
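In the scalar special case on a box-shaped convex set, the projected gradient step reduces to a componentwise clip. The single-objective sketch below (with a fixed step length rather than the paper's spectral nonmonotone rule, and an illustrative test problem) shows only the projection mechanics:

```python
def projected_gradient_box(grad, x0, lo, hi, alpha=0.1, max_iter=1000):
    """Projected gradient for min f(x) s.t. lo <= x <= hi (componentwise box).
    The Euclidean projection onto a box is a componentwise clip."""
    clip = lambda v: [min(max(vi, l), h) for vi, l, h in zip(v, lo, hi)]
    x = clip(list(x0))
    for _ in range(max_iter):
        g = grad(x)
        x_new = clip([xi - alpha * gi for xi, gi in zip(x, g)])
        if max(abs(a - b) for a, b in zip(x, x_new)) < 1e-12:
            break                         # fixed point of the projected step
        x = x_new
    return x

# Illustrative problem: minimize (x-2)^2 + (y+3)^2 over [0,1] x [0,1];
# the constrained solution is the corner (1, 0).
grad = lambda x: [2 * (x[0] - 2), 2 * (x[1] + 3)]
sol = projected_gradient_box(grad, [0.5, 0.5], [0.0, 0.0], [1.0, 1.0])
```

The multiobjective extension replaces the single gradient by a common descent direction for all objectives, but the project-then-test structure is the same.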
15. An Improved HS Conjugate Gradient Method under the Standard Wolfe Line Search
Authors: 王森森, 郑宗剑, 韩信. 《四川文理学院学报》, 2024, No. 2, pp. 50-55.
By modifying the existing HS conjugate gradient method, an improved HS conjugate gradient method with the descent property is proposed, and the descent property of the algorithm is established. Under the standard Wolfe line search conditions, the improved HS algorithm is proved to be globally convergent. Finally, comparative numerical experiments show that the new algorithm performs excellently.
Keywords: unconstrained optimization; conjugate gradient method; standard Wolfe line search; global convergence
16. A BB Gradient Method Based on a Modified Secant Equation
Authors: 杨爽艺. 《商洛学院学报》, 2024, No. 2, pp. 22-25.
Combining a modified secant equation with the BB gradient method yields a class of modified BB step sizes. Using the Zhang-Hager nonmonotone line search, an improved BB gradient method (the MB method) is proposed. Under certain assumptions, the MB method is globally convergent. Extensive numerical experiments comparing the MB method with several BB methods of the same type show that the MB method gives the best numerical performance.
Keywords: Barzilai-Borwein gradient method; nonmonotone line search; unconstrained optimization; modified secant equation
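The classical BB step on which such methods build is αk = sᵀs/sᵀy with s = xk − x_{k−1} and y = gk − g_{k−1}; the modified variant in the paper replaces y via a corrected secant equation, which is not reproduced here. A plain BB sketch on an illustrative convex quadratic:

```python
def bb_gradient(grad, x0, alpha0=0.1, max_iter=200):
    """Gradient method with the Barzilai-Borwein step alpha = s.s / s.y."""
    x = list(x0)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) < 1e-16:
            break
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]   # s_k = x_k - x_{k-1}
        y = [a - b for a, b in zip(g_new, g)]   # y_k = g_k - g_{k-1}
        sy = sum(a * b for a, b in zip(s, y))
        if sy > 0:                              # keep the step well defined
            alpha = sum(a * a for a in s) / sy  # BB1 step: s.s / s.y
        x, g = x_new, g_new
    return x

# Illustrative quadratic: f(x, y) = x^2 + 10*y^2
sol = bb_gradient(lambda x: [2 * x[0], 20 * x[1]], [3.0, -2.0])
```

BB iterations are typically nonmonotone in f, which is exactly why they are paired with nonmonotone line searches such as Zhang-Hager rather than a monotone Armijo rule.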
17. An Improved FR-Type Spectral Conjugate Gradient Method under the Wolfe Line Search
Authors: 王森森, 韩信, 吴祥标. 《遵义师范学院学报》, 2024, No. 5, pp. 80-84.
As a generalization of the classical conjugate gradient method, the spectral conjugate gradient method is one of the effective methods for solving large-scale unconstrained optimization problems. Based on the standard Wolfe line search criterion and a sufficient descent condition, an FR-type spectral conjugate gradient method with the sufficient descent property is proposed. Under mild assumptions, the algorithm is globally convergent. Finally, the new algorithm is compared with existing modified FR-type spectral conjugate gradient methods; numerical results show that the proposed algorithm is highly effective.
Keywords: unconstrained optimization; spectral conjugate gradient method; sufficient descent; standard Wolfe line search criterion; global convergence
18. An Improved PRP Conjugate Gradient Method Based on a Nondescent Line Search, with an Application to Image Restoration
Authors: 李朋原. 《现代信息科技》, 2024, No. 17, pp. 62-67.
The PRP method is one of the most effective nonlinear conjugate gradient methods. However, it cannot guarantee that it produces a descent direction of the objective function, which complicates global convergence for general functions. To guarantee the global convergence of the PRP method, an improved PRP conjugate gradient method is proposed. Targeting nonconvex optimization problems, the paper briefly introduces a nondescent line search technique and some appropriate assumptions, and discusses the global convergence of the improved PRP method. Experiments in MATLAB verify the effectiveness and practicality of the new method on unconstrained optimization and image restoration problems.
Keywords: conjugate gradient method; nondescent line search; global convergence; unconstrained optimization; image restoration
19. A Cyclic BB Gradient Method with Delayed Step Sizes
Authors: 杨奕涵. 《东莞理工学院学报》, 2024, No. 1, pp. 1-6.
Gradient methods are commonly used for large-scale unconstrained optimization. By extending the step size for minimizing quadratic functions to general unconstrained problems, and using a one-step delay together with the idea of cyclic gradient methods, a cyclic Barzilai-Borwein (BB) gradient method is proposed. Combined with the Zhang-Hager nonmonotone line search technique, this yields the CBBGM algorithm for general unconstrained optimization. Under suitable assumptions, the CBBGM algorithm is globally convergent, and when the objective function is strongly convex it converges linearly. Numerical experiments show that the proposed method is computationally more efficient than existing methods.
Keywords: Barzilai-Borwein gradient method; unconstrained optimization; Zhang-Hager nonmonotone line search; global convergence
20. Modified LS Method for Unconstrained Optimization
Authors: Jinkui Liu, Li Zheng. Applied Mathematics, 2011, No. 6, pp. 779-782.
In this paper, a new conjugate gradient formula and its algorithm for solving unconstrained optimization problems are proposed. The given formula satisfies the descent condition. Under the Grippo-Lucidi line search, the global convergence of the given method is discussed. Numerical results show that the new method is efficient for the given test problems.
Keywords: unconstrained optimization; conjugate gradient method; Grippo-Lucidi line search; global convergence