Abstract
The main purpose of machine learning is to endow computer systems with human-like learning ability, and numerical optimization methods play a crucial role in improving their efficiency and effectiveness. In the L1-SVM optimization problem, the truncated hinge loss can be used to eliminate excess support vectors and improve the robustness of the model, but it leads to a difficult non-convex optimization problem. Majorization-Minimization (MM) is an effective framework for solving non-convex problems; its main idea is to convert the non-convex objective function into a sequence of convex subproblems by constructing a series of appropriate convex upper bounds. The concave-convex procedure (CCCP), which is commonly used to solve non-convex problems, belongs to this framework. This paper analyzes why the CCCP algorithm for the truncated L1-SVM problem yields sparse support vectors and, building on this analysis and the advantages of the multi-stage strategy, proposes a multi-stage MM method that achieves better sparsity. Finally, experimental comparisons on large-scale datasets verify the effectiveness of the proposed algorithm.
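As a brief illustration (a sketch of the standard difference-of-convex construction, not reproduced from the paper, and assuming the usual $\tfrac{1}{2}\|w\|^2$-regularized hinge-loss objective), the truncated hinge loss and the resulting MM/CCCP subproblem can be written as follows. With margin $z_i = y_i w^{\top} x_i$, hinge $H_a(z) = \max(0, a - z)$, and truncation level $s$ (commonly $s \le 0$),
$T_s(z) = \min(H_1(z),\, 1 - s) = H_1(z) - H_s(z),$
so the non-convex objective splits into a convex part minus another convex function:
$f(w) = \tfrac{1}{2}\|w\|^2 + C\sum_i H_1(z_i) - C\sum_i H_s(z_i).$
Each MM/CCCP step keeps the convex part and replaces the subtracted term by its linearization at the current iterate $w^{(t)}$, giving the convex subproblem
$w^{(t+1)} \in \arg\min_w \; \tfrac{1}{2}\|w\|^2 + C\sum_i H_1(y_i w^{\top} x_i) + C \sum_{i:\, y_i w^{(t)\top} x_i < s} y_i w^{\top} x_i.$
For an example with $y_i w^{(t)\top} x_i < s$, the added linear term cancels the slope of its hinge term, so such heavily misclassified points tend to drop out of the set of support vectors; this is consistent with the sparse-support-vector behaviour of CCCP on the truncated L1-SVM that the abstract describes.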
Authors
YUAN Youhong; ZHOU Kai (Army Academy of Artillery and Air Defense, Hefei 230031)
Source
Computer & Digital Engineering, 2021, No. 9, pp. 1847-1851 (5 pages)