An Augmented Lagrangian Method for l2,1-Norm Minimization Problems in Machine Learning

2014
Author(s): Liu Shulun, Li Jie
2021, Vol 0 (0), pp. 0
Author(s): Fatemeh Bazikar, Saeed Ketabchi, Hossein Moosaei

In this paper, we propose a method for solving the twin bounded support vector machine (TBSVM) for binary classification. To this end, we use the augmented Lagrangian (AL) optimization method and a smoothing technique to obtain new unconstrained smooth minimization problems for the TBSVM classifiers. First, the augmented Lagrangian method is employed to convert the TBSVM into unconstrained minimization problems, called AL-TBSVM. We then attempt to solve the primal problems of AL-TBSVM by converting them into smooth unconstrained minimization problems. The smooth reformulations of AL-TBSVM, which we call AL-STBSVM, are solved by the well-known Newton's method. Finally, experimental results on artificial and several University of California Irvine (UCI) benchmark data sets are provided, along with a statistical analysis, to show the superior performance of our method in terms of classification accuracy and learning speed.
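The pipeline the abstract describes (constrained problem → augmented Lagrangian → unconstrained smooth problem → Newton's method) can be sketched on a toy equality-constrained quadratic program; the function name and the specific QP below are illustrative, not the paper's TBSVM formulation:

```python
import numpy as np

def augmented_lagrangian_qp(a, b, rho=10.0, iters=50):
    """Toy sketch: minimize ||x||^2 subject to a @ x = b via the
    augmented Lagrangian method.  Each inner problem is quadratic,
    so one Newton step solves it exactly (mirroring the idea of
    smoothing followed by Newton's method)."""
    n = len(a)
    x = np.zeros(n)
    lam = 0.0                       # Lagrange multiplier estimate
    for _ in range(iters):
        # Inner problem: min_x ||x||^2 + lam*(a@x - b) + rho/2*(a@x - b)^2
        # Gradient: 2x + (lam + rho*(a@x - b)) * a
        # Hessian : 2I + rho * outer(a, a)  -> one Newton step is exact
        H = 2.0 * np.eye(n) + rho * np.outer(a, a)
        g = 2.0 * x + (lam + rho * (a @ x - b)) * a
        x = x - np.linalg.solve(H, g)
        lam += rho * (a @ x - b)    # multiplier update
    return x

x = augmented_lagrangian_qp(np.array([1.0, 2.0]), 3.0)
# x approaches the analytic minimizer b*a/||a||^2 = [0.6, 1.2]
```

The multiplier update drives the constraint residual to zero, so no penalty parameter needs to grow unboundedly; for the nonsmooth hinge-loss terms of the actual TBSVM, a smoothing step would be required before Newton's method applies.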


Author(s): Joachim Giesen, Soeren Laue

Many machine learning methods entail minimizing a loss function that is a sum of the losses for each data point. The form of the loss function is exploited algorithmically, for instance in stochastic gradient descent (SGD) and in the alternating direction method of multipliers (ADMM). However, there are also machine learning methods where the underlying optimization problem features the data points not in the objective function but in the form of constraints, typically one constraint per data point. Here, we address the problem of solving convex optimization problems with many convex constraints. Our approach is an extension of ADMM. A straightforward distributed ADMM implementation solves constrained subproblems on different compute nodes, whose solutions are aggregated until a consensus solution is reached. Hence, the straightforward approach has three nested loops: one for reaching consensus, one for the constraints, and one for the unconstrained problems. Here, we show that solving the costly constrained subproblems can be avoided. Our approach combines the ability of ADMM to solve convex optimization problems in a distributed setting with the ability of the augmented Lagrangian method to handle constraints. Consequently, our algorithm needs only two nested loops. We prove that it inherits the convergence guarantees of both ADMM and the augmented Lagrangian method. Experimental results corroborate our theoretical findings.
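The structural idea can be sketched on a toy problem; the function name, the parameter choices, and the specific quadratic objective below are illustrative assumptions, not the authors' algorithm. Each "node" owns one constraint and solves an unconstrained quadratic: its share of the objective, an augmented-Lagrangian term for its own constraint, and the usual ADMM consensus term:

```python
import numpy as np

def consensus_admm_al(c, A, b, rho=1.0, mu=10.0, iters=2000):
    """Toy sketch: minimize ||x - c||^2 subject to A @ x = b, with one
    constraint row per 'node'.  Each node's subproblem is UNconstrained:
    the constraint is handled by an augmented-Lagrangian penalty, so the
    costly constrained subproblems of plain consensus ADMM are avoided."""
    m, n = A.shape
    z = np.zeros(n)                 # consensus variable
    X = np.zeros((m, n))            # local copies, one per node
    Y = np.zeros((m, n))            # consensus dual variables
    lam = np.zeros(m)               # constraint multipliers (AL part)
    for _ in range(iters):
        for i in range(m):          # local updates (parallelizable)
            a = A[i]
            # min_x ||x-c||^2/m + lam_i*(a@x - b_i) + mu/2*(a@x - b_i)^2
            #       + Y_i@(x - z) + rho/2*||x - z||^2   (exact quadratic solve)
            H = (2.0 / m) * np.eye(n) + mu * np.outer(a, a) + rho * np.eye(n)
            g = (2.0 / m) * c + (mu * b[i] - lam[i]) * a - Y[i] + rho * z
            X[i] = np.linalg.solve(H, g)
            lam[i] += mu * (A[i] @ X[i] - b[i])   # AL multiplier update
        z = (X + Y / rho).mean(axis=0)            # consensus step
        Y += rho * (X - z)                        # consensus dual update
    return z
```

At a fixed point the local copies agree with the consensus variable and each constraint multiplier has converged, which reproduces the KKT conditions of the original constrained problem; this mirrors the two-nested-loop structure the abstract describes, though the actual convergence proof is the paper's, not this sketch's.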


2020, Vol 14, pp. 174830262097353
Author(s): Noppadol Chumchob, Ke Chen

Variational methods for image registration typically involve a regularizer to ensure that the resulting problem is well posed and admits a solution. Different choices of regularizer lead to different deformations. On one hand, conventional regularizers, such as the elastic, diffusion and curvature regularizers, generate globally smooth deformations and are generally useful for many applications. On the other hand, these regularizers perform poorly in applications where discontinuities or steep gradients in the deformations are required. As is well known, the total variation (TV) regularizer is more appropriate for preserving discontinuities in the deformations. However, it is difficult to develop an efficient numerical method that ensures the numerical solutions satisfy this requirement, because of the non-differentiability and non-linearity of the TV regularizer. In this work we focus on the computational challenges arising in approximately solving a TV-based image registration model. Motivated by many efficient numerical algorithms in image restoration, we propose to use the augmented Lagrangian method (ALM). Each iteration of our ALM requires solving two subproblems: the first admits no exact solution and must be approximated, while the second has a closed-form solution. To this end, we propose an efficient nonlinear multigrid (NMG) method to obtain an approximate solution to the first subproblem. Numerical results on real medical images confirm not only that the proposed ALM is more computationally efficient than some existing methods, but also that it delivers accurate registration results, with the desired properties of the constructed deformations, in a reasonable number of iterations.
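The two-subproblem splitting can be illustrated on a minimal 1-D TV denoising analogue, a simplification assumed here for brevity: in the actual 2-D registration model the first subproblem is nonlinear and needs the NMG solver, whereas in this analogue it reduces to a linear solve; the second subproblem's closed-form shrinkage has the same structure in both cases.

```python
import numpy as np

def shrink(v, t):
    """Closed-form solution of the second (auxiliary-variable)
    subproblem: soft-thresholding, the proximal map of t*|.|_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def tv_denoise_1d(f, alpha=1.0, rho=2.0, iters=200):
    """1-D analogue: min_u 0.5*||u - f||^2 + alpha*||Du||_1 via an
    augmented-Lagrangian splitting d = Du.  The u-subproblem is a
    linear solve here (NMG-approximated and nonlinear in the actual
    registration model); the d-subproblem is closed form."""
    n = len(f)
    D = np.diff(np.eye(n), axis=0)          # forward-difference operator
    H = np.eye(n) + rho * D.T @ D           # u-subproblem system matrix
    u = f.copy()
    d = np.zeros(n - 1)                     # auxiliary variable ~ Du
    b = np.zeros(n - 1)                     # scaled multiplier
    for _ in range(iters):
        u = np.linalg.solve(H, f + rho * D.T @ (d - b))  # first subproblem
        d = shrink(D @ u + b, alpha / rho)               # second: closed form
        b += D @ u - d                                   # multiplier update
    return u
```

On a noisy piecewise-constant signal this recovers near-flat segments while keeping the jump sharp, which is precisely the discontinuity-preserving behavior of TV that the smooth elastic, diffusion and curvature regularizers cannot reproduce.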

