Sparse linear programming (SLP) is a linear programming problem equipped with a sparsity constraint; it is nonconvex, discontinuous and generally NP-hard owing to the combinatorial structure involved. In this paper, by rewriting the sparsity constraint in a disjunctive form, we present an explicit formula for the Lagrangian dual problem of the SLP, in terms of an unconstrained piecewise-linear convex programming problem that admits strong duality under bi-dual sparsity consistency. Furthermore, we establish a saddle point theorem based on the strong duality and analyze two classes of stationary points of the saddle point problem. Finally, we extend these results to the SLP in which the zero lower bound is replaced by a certain negative constant.
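The abstract does not reproduce the model itself; a minimal sketch of a standard SLP formulation consistent with the description above (a zero lower bound and a cardinality constraint), written in LaTeX and assuming equality constraints Ax = b and a prescribed sparsity level s, would read:

% Standard sparse linear program assumed for illustration:
% linear objective, linear equality constraints, zero lower bound,
% and a cardinality (sparsity) constraint ||x||_0 <= s.
\begin{equation*}
\min_{x \in \mathbb{R}^n} \; c^{\top} x
\quad \text{s.t.} \quad A x = b, \qquad x \ge 0, \qquad \|x\|_0 \le s,
\end{equation*}
% where ||x||_0 counts the nonzero entries of x and s < n. The last
% constraint is what makes the problem nonconvex, discontinuous and
% combinatorial; the paper's extension replaces x >= 0 by x >= -t for
% some constant t > 0.

The exact constraint set used in the paper may differ (e.g., inequality rather than equality constraints); the sketch only fixes notation for the sparsity constraint and the lower bound discussed above.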
When the coordinates of a set of points are known, the pairwise Euclidean distances among the points are easily computed. Conversely, if the Euclidean distance matrix is given, a set of coordinates for those points can be recovered through the well-known classical Multi-Dimensional Scaling (MDS). In this paper, we consider the case where some of the distances are far from accurate (containing large noise or even missing). In such a situation, the order of the known distances (i.e., that some distances are larger than others) is valuable information that often yields a far more accurate reconstruction of the points than using the magnitudes of the known distances alone. The methods that make use of this order information are collectively known as nonmetric MDS. A challenging computational issue common to all existing nonmetric MDS methods is that there are often a large number of ordinal constraints. In this paper, we cast the problem as a matrix optimization problem with ordinal constraints and adapt an existing smoothing Newton method to this matrix problem. Extensive numerical results demonstrate the efficiency of the algorithm, which can potentially handle a very large number of ordinal constraints.
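For reference, the classical MDS reconstruction mentioned above can be sketched in a few lines of Python. The snippet below is an illustrative sketch of the textbook procedure (double centering of the squared distance matrix followed by an eigendecomposition); it is not the nonmetric, ordinal-constrained method developed in the paper, and the function name and example data are ours.

import numpy as np

def classical_mds(D, dim=2):
    # Recover point coordinates from a complete Euclidean distance matrix D
    # by classical MDS: double-center the squared distances and keep the
    # leading eigenpairs of the resulting Gram matrix.
    n = D.shape[0]
    D2 = D ** 2                                  # squared distances
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ D2 @ J                        # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)         # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:dim]        # indices of the 'dim' largest
    L = np.sqrt(np.maximum(eigvals[idx], 0.0))   # clip tiny negative eigenvalues
    return eigvecs[:, idx] * L                   # n x dim coordinate matrix

# Example: exact, noise-free distances recover the configuration
# up to a rigid motion (rotation, reflection, translation).
X = np.random.rand(10, 2)
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
Y = classical_mds(D, dim=2)

When the distances are noisy or partially missing, this eigendecomposition step alone is unreliable, which is precisely the setting in which the ordinal (nonmetric) information exploited by the paper becomes valuable.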
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 11431002, 11771038 and 11728101), the State Key Laboratory of Rail Traffic Control and Safety, Beijing Jiaotong University (Grant No. RCS2017ZJ001), and the China Scholarship Council (Grant No. 201707090019).