Funding: This work was supported in part by the National Natural Science Foundation of China (No. 11431004) and the Innovation Program of the Shanghai Municipal Education Commission.
Abstract: In this paper, we study smoothing approximations for some piecewise smooth functions. We first present two approaches for the one-dimensional case: a global approach constructs smoothing approximations over the whole domain, while a local approach constructs them within appropriate neighborhoods of the nonsmooth points. We obtain error estimates for both approaches and discuss whether the smoothing approximations inherit the convexity of the original functions. Furthermore, we extend the global approach to some multi-dimensional cases.
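The paper's own constructions are not reproduced in this listing. As a minimal sketch of the global approach, the classic smoothing f_mu(t) = sqrt(t^2 + mu^2) of the absolute value (an illustrative choice, not necessarily the authors') exhibits both features the abstract mentions: a uniform error estimate 0 <= f_mu(t) - |t| <= mu, and inheritance of convexity, since f_mu is convex for every mu > 0.

```python
import math

def smooth_abs(t, mu):
    """Global smoothing of |t|: f_mu(t) = sqrt(t^2 + mu^2).
    Convex for every mu > 0, with uniform error 0 <= f_mu(t) - |t| <= mu."""
    return math.sqrt(t * t + mu * mu)

mu = 1e-3
grid = [x / 100.0 for x in range(-200, 201)]
worst_error = max(smooth_abs(t, mu) - abs(t) for t in grid)
print(f"worst error on grid: {worst_error:.6f} (bound: {mu})")
```

The bound follows from |t| <= sqrt(t^2 + mu^2) <= |t| + mu, with the worst case mu attained at the nonsmooth point t = 0.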
Abstract: In this paper, we first give a smoothing approximation function for a nonsmooth system based on box-constrained variational inequalities, and then present a new smoothing approximation algorithm. Under suitable conditions, we show that the method is globally and superlinearly convergent. A few numerical results are also reported.
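The abstract does not give the smoothing function itself. One standard building block for box-constrained problems (an illustrative assumption, not the authors' construction) is the Chen-Harker-Kanzow-Smale smoothing of the plus function, which composes into a smooth approximation of the box projection mid(l, u, x) appearing in the natural residual of such variational inequalities:

```python
import math

def chks_plus(x, mu):
    """Chen-Harker-Kanzow-Smale smoothing of the plus function max(0, x)."""
    return 0.5 * (x + math.sqrt(x * x + 4.0 * mu * mu))

def smooth_mid(l, u, x, mu):
    """Smooth approximation of the box projection mid(l, u, x) = min(u, max(l, x)),
    obtained by composing two smoothed plus functions."""
    return l + chks_plus(x - l, mu) - chks_plus(x - u, mu)

# As mu -> 0 the approximation recovers the exact projection onto [0, 1].
for x in (-3.0, 0.5, 2.0):
    print(x, smooth_mid(0.0, 1.0, x, 1e-8))
```

For fixed mu > 0 the map is smooth in x, which is what makes Newton-type smoothing algorithms with superlinear local convergence applicable.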
Funding: National Natural Science Foundation of China (Nos. 61272015, 11201123); the Scientific Research Foundation for the Doctor of Henan University of Science & Technology, China (No. 09001476); and the School Foundation of Henan University of Science & Technology, China (No. 2012QN011).
Abstract: To reduce the influence of outliers on the support vector machine (SVM) classification problem, a new tangent loss function was constructed. Since the tangent loss function is not smooth on some interval, a smoothing function was used to approximate it there. From this loss function, the corresponding tangent SVM (TSVM) was obtained. Experimental results show that TSVM is less sensitive to outliers than SVM, so both the proposed loss function and TSVM are effective.
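The tangent loss itself is not given in the abstract. The interval-smoothing idea can be sketched on the familiar hinge loss max(0, 1 - t) (a stand-in for the paper's loss): the loss is kept unchanged away from its kink and replaced by a quadratic bridge on a small interval around it, yielding a continuously differentiable surrogate.

```python
def smooth_hinge(t, delta):
    """Hinge loss max(0, 1 - t), smoothed only on the interval
    1 - delta <= t <= 1 + delta by a C^1 quadratic bridge."""
    z = 1.0 - t
    if z >= delta:    # margin-violating region: loss unchanged
        return z
    if z <= -delta:   # well-classified region: loss unchanged
        return 0.0
    return (z + delta) ** 2 / (4.0 * delta)  # quadratic bridge around the kink

print(smooth_hinge(0.0, 0.1))  # outside the bridge: equals the hinge value 1.0
print(smooth_hinge(1.0, 0.1))  # at the former kink: small positive value
```

Values and first derivatives match at z = ±delta, so the surrogate is C^1 everywhere while agreeing with the original loss outside the smoothing interval.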
Funding: This work was supported by the National Natural Science Foundation of China (51875457), the Key Research Project of Shanxi Province (2019GY-061), and the International S&T Cooperation Program of Shanxi Province (2019KW-056).
Abstract: To improve the learning speed and reduce the computational complexity of the twin support vector hypersphere (TSVH), this paper presents a smoothed twin support vector hypersphere (STSVH) based on the smoothing technique. STSVH generates two hyperspheres, each covering as many samples as possible from its own class. Additionally, STSVH solves only a pair of unconstrained differentiable quadratic programming problems (QPPs) rather than a pair of constrained dual QPPs, which makes STSVH faster than TSVH. Exploiting the differentiability of STSVH, a fast Newton-Armijo algorithm is used to solve it. Numerical experiments on normally distributed clustered (NDC) data sets as well as University of California Irvine (UCI) data sets indicate the significant advantages of the proposed STSVH in terms of efficiency and generalization performance.
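The STSVH objective is not reproduced here. A generic damped Newton method with an Armijo backtracking line search, the solver style the abstract names, can be sketched as follows (the toy quadratic objective is our own illustrative choice):

```python
import numpy as np

def newton_armijo(f, grad, hess, x0, tol=1e-8, beta=0.5, sigma=1e-4, max_iter=50):
    """Damped Newton method with an Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), -g)  # Newton direction
        t = 1.0
        # Armijo condition: backtrack until sufficient decrease along d
        while f(x + t * d) > f(x) + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Toy strongly convex quadratic with minimizer (1, -2).
c = np.array([1.0, -2.0])
f = lambda x: float(np.sum((x - c) ** 2))
grad = lambda x: 2.0 * (x - c)
hess = lambda x: 2.0 * np.eye(2)
print(newton_armijo(f, grad, hess, np.zeros(2)))
```

On a strongly convex quadratic the full Newton step is accepted immediately and the minimizer is reached in one iteration; for the smoothed QPPs of the paper the backtracking guarantees global convergence.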
Funding: Supported by the Hong Kong Research Grants Council (Grant No. PolyU 5001/12P) and the National Natural Science Foundation of China (Grant No. 11101231).
Abstract: We propose a smoothing trust region filter algorithm for nonsmooth nonconvex least squares problems. Under certain conditions, we present convergence theorems showing that the proposed algorithm reaches a Clarke stationary point or a global minimizer of the objective function. Preliminary numerical experiments show the efficiency of the proposed algorithm for finding zeros of systems of high-degree polynomial equations on the sphere and for solving differential variational inequalities.
基金supported by the National Natural Science Foundation of China(No.12001144)Zhejiang Provincial Natural Science Foundation of China(No.LQ20A010007)NSF/DMS-2152961。
Abstract: In this paper, the authors propose a novel smoothing descent-type algorithm with extrapolation for solving a class of constrained nonsmooth and nonconvex problems, where the nonconvex term is possibly nonsmooth. The algorithm adopts the proximal gradient method with extrapolation and a safeguarding policy to minimize the smoothed objective function, for better practical and theoretical performance. Moreover, it uses an easily checked rule to update the smoothing parameter, ensuring that any accumulation point of the generated sequence is an (affine-scaled) Clarke stationary point of the original nonsmooth and nonconvex problem. Experimental results indicate the effectiveness of the proposed algorithm.
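The authors' full scheme (smoothing-parameter updates and safeguarding) is not reproduced here. The inner engine they name, proximal gradient with extrapolation, can be sketched on a toy LASSO problem; the FISTA-style momentum rule and the test problem are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_grad_extrapolated(A, b, lam, n_iter=500):
    """Proximal gradient with extrapolation on
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = x_prev = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(n_iter):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x + ((t - 1.0) / t_next) * (x - x_prev)   # extrapolation step
        x_prev, x = x, soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        t = t_next
    return x

# For A = I the minimizer is simply soft_threshold(b, lam).
print(prox_grad_extrapolated(np.eye(2), np.array([3.0, 0.1]), 1.0))
```

In the paper this kind of inner loop is run on the smoothed objective while a separate rule drives the smoothing parameter to zero.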
Abstract: We consider the problem of minimizing the average of a large number of smooth component functions over one smooth inequality constraint. We propose and analyze a stochastic Moving Balls Approximation (SMBA) method. Like stochastic gradient (SG) methods, the SMBA method has an iteration cost independent of the number of component functions, and by exploiting the smoothness of the constraint function, our method can be implemented easily. Theoretical and computational properties of SMBA are studied, and convergence results are established. Numerical experiments indicate that our algorithm dramatically outperforms the existing Moving Balls Approximation (MBA) algorithm on problems with this structure.
基金the National Natural Science Foundation of China(Nos.11331012 and 11301516).
Abstract: In this paper, we consider the problem of computing the smallest enclosing ball (SEB) of a set of m balls in R^n, where the product mn is large. We first approximate the non-differentiable SEB problem by its log-exponential aggregation function and then propose a computationally efficient inexact Newton-CG algorithm for the smoothing approximation problem, exploiting its special (approximate) sparsity structure. The key difference between the proposed inexact Newton-CG algorithm and the classical Newton-CG algorithm is that the gradient and Hessian-vector products are computed inexactly, which makes the algorithm capable of solving large-scale SEB problems. We give an adaptive criterion for inexactly computing the gradient/Hessian and establish global convergence of the proposed algorithm. We illustrate the efficiency of the proposed algorithm using the classical Newton-CG algorithm as well as the algorithm of Zhou et al. (Comput Optim Appl 30:147-160, 2005) as benchmarks.
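The log-exponential aggregation named in the abstract is the standard soft-max smoothing of a pointwise maximum. For the SEB objective f(x) = max_i (||x - c_i|| + r_i), it satisfies f(x) <= f_mu(x) <= f(x) + mu log m; the sketch below (data and variable names are illustrative) shows the construction with the usual max-shift for numerical stability.

```python
import numpy as np

def seb_objective(x, centers, radii):
    """Radius of the smallest ball centered at x enclosing all given balls:
    f(x) = max_i (||x - c_i|| + r_i)  (nonsmooth)."""
    return float(np.max(np.linalg.norm(x - centers, axis=1) + radii))

def seb_smoothed(x, centers, radii, mu):
    """Log-exponential aggregation f_mu(x) = mu * log(sum_i exp((||x-c_i||+r_i)/mu)),
    which satisfies f <= f_mu <= f + mu * log(m)."""
    g = (np.linalg.norm(x - centers, axis=1) + radii) / mu
    gmax = g.max()  # shift by the max so the exponentials cannot overflow
    return float(mu * (gmax + np.log(np.sum(np.exp(g - gmax)))))
```

Since f_mu is smooth and strictly convex for mu > 0, Newton-CG type methods apply directly, and driving mu toward zero recovers the original SEB radius to any desired accuracy.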