Linear convergence rate for distributed optimization with the alternating direction method of multipliers

Author(s):  
F. Iutzeler ◽  
P. Bianchi ◽  
Ph. Ciblat ◽  
W. Hachem
Author(s):  
Min Li ◽  
Zhongming Wu

In this paper, we propose an inexact majorized symmetric Gauss–Seidel (sGS) alternating direction method of multipliers (ADMM) with indefinite proximal terms for multi-block convex composite programming. This method is a specific instance of an inexact majorized ADMM that we also propose for solving a general two-block separable optimization problem. The new methods adopt relative error criteria to solve the involved subproblems approximately, and the step-sizes can be chosen in the range [Formula: see text]. Under fairly general conditions, we establish the global convergence and the Q-linear convergence rate of the proposed methods.
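The abstract above builds on the classical two-block ADMM framework. The following is not the authors' inexact majorized sGS-ADMM, but a minimal sketch of the standard two-block ADMM it extends, applied to the lasso problem min ½‖Ax − b‖² + λ‖z‖₁ subject to x − z = 0; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Standard two-block ADMM sketch for
    min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x - z = 0."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    AtA = A.T @ A
    Atb = A.T @ b
    M = AtA + rho * np.eye(n)
    for _ in range(n_iter):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u)
        x = np.linalg.solve(M, Atb + rho * (z - u))
        # z-update: soft thresholding (prox of the l1 norm)
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual ascent step
        u = u + x - z
    return z
```

The indefinite-proximal and inexact variants in the paper replace the exact x- and z-subproblem solves above with approximate solves controlled by relative error criteria.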


2021 ◽  
Vol 2021 ◽  
pp. 1-19
Author(s):  
Hansi K. Abeynanda ◽  
G. H. J. Lanel

Distributed optimization is an important concept with applications in control theory and many related fields, as it is highly fault-tolerant and far more scalable than centralized optimization. Centralized solution methods are unsuitable for many application domains that consist of a large number of networked systems. In general, these large-scale networked systems cooperatively find an optimal solution to a common global objective during the optimization process. This motivates an analysis of the distributed optimization techniques demanded in such settings. This paper presents an overview of decomposition methods as well as currently existing distributed methods and techniques employed in large-scale networked systems. A detailed analysis of gradient-like methods, subgradient methods, and methods of multipliers, including the alternating direction method of multipliers, is presented. These methods are analyzed empirically by using numerical examples. Moreover, an example highlighting the fact that the gradient method fails to solve distributed problems in some circumstances is discussed in the numerical results. A numerical implementation demonstrates that the alternating direction method of multipliers can solve this particular problem, revealing its robustness relative to the gradient method. Finally, we conclude the paper with possible future research directions.
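To illustrate the distributed setting described above, here is a toy consensus ADMM sketch in which each agent holds a local quadratic objective and all agents must agree on a common variable; the objective, star topology (local updates plus one central average), and parameters are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

def consensus_admm(a, rho=1.0, n_iter=100):
    """Consensus ADMM sketch for
    min sum_i 0.5*(x_i - a_i)^2  s.t.  x_i = z for all agents i.
    The optimum is z* = mean(a)."""
    N = len(a)
    x = np.zeros(N)   # local copies, one per agent
    u = np.zeros(N)   # scaled dual variables
    z = 0.0           # global consensus variable
    for _ in range(n_iter):
        # local x-updates: each agent solves its own subproblem in parallel
        x = (a + rho * (z - u)) / (1.0 + rho)
        # global z-update: averaging (the only coordination step)
        z = np.mean(x + u)
        # local dual updates
        u = u + x - z
    return z
```

The key structural point is that only the averaging step requires communication; everything else is computed locally, which is what makes the method fault-tolerant and scalable.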


Author(s):  
Changjie Fang ◽  
Jingyu Chen ◽  
Shenglan Chen

In this paper, we propose an image denoising algorithm for compressed sensing based on the alternating direction method of multipliers (ADMM). We prove that the objective function values of the iterates converge to the optimal value. We also establish the [Formula: see text] convergence rate of our algorithm in the ergodic sense. Simulation results show that our algorithm is more efficient for image denoising than existing methods.

