Proximal point nonlinear rescaling method for convex optimization

2011, Vol. 1 (2), pp. 283-299
Author(s): Igor Griva, Roman A. Polyak

1997, Vol. 2 (1-2), pp. 97-120
Author(s): Y. I. Alber, R. S. Burachik, A. N. Iusem

In this paper we show the weak convergence and stability of the proximal point method when applied to the constrained convex optimization problem in uniformly convex and uniformly smooth Banach spaces. In addition, we establish a nonasymptotic estimate of the convergence rate of the sequence of functional values for the unconstrained case. This estimate depends on a geometric characteristic of the dual Banach space, namely its modulus of convexity. We apply a new technique that combines Banach space geometry, estimates of duality mappings, nonstandard Lyapunov functionals, and generalized projection operators in Banach spaces.
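To fix ideas, the classical proximal point iteration that this abstract generalizes to Banach spaces computes x_{k+1} = argmin_x { f(x) + (1/(2λ)) (x - x_k)² }. A minimal sketch in the Hilbert-space (here, real-line) setting, using the illustrative example f(x) = x², whose proximal map has the closed form v / (1 + 2λ) (the function names and step size here are assumptions for illustration, not taken from the paper):

```python
def prox_quadratic(v, lam):
    """Proximal operator of f(x) = x**2 with parameter lam: argmin_x x**2 + (1/(2*lam))*(x - v)**2."""
    return v / (1.0 + 2.0 * lam)

def proximal_point(x0, lam=1.0, iters=50):
    """Run the proximal point iteration x_{k+1} = prox_{lam*f}(x_k) from x0."""
    x = x0
    for _ in range(iters):
        x = prox_quadratic(x, lam)
    return x

x_star = proximal_point(10.0)
# the iterates contract by a factor 1/(1 + 2*lam) per step toward the minimizer 0
```

The paper's contribution lies in establishing convergence of such iterations when the squared-distance regularizer is replaced by Lyapunov functionals built from the Banach space's duality mapping.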


2014, Vol. 2014, pp. 1-8
Author(s): Yu-hua Zeng, Yu-fei Yang, Zheng Peng

We propose an appealing line-search-based partial proximal alternating directions (LSPPAD) method for solving a class of separable convex optimization problems. Problems of this class arise frequently in practice. The proposed method solves two subproblems at each iteration: one is solved by a proximal point method, while the proximal term is absent from the other. Both subproblems admit inexact solutions. A line search technique is used to guarantee convergence. The convergence of the LSPPAD method is established under suitable conditions. The advantage of the proposed method is that it preserves the tractability of the subproblem in which the proximal term is absent. Numerical tests show that the LSPPAD method outperforms the existing alternating projection based prediction-correction (APBPC) method when both are applied to the described problem.
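The two-block structure the abstract describes can be sketched on a toy separable problem, min (x - 1)² + y² subject to x + y = 4, where the x-subproblem carries a proximal term and the y-subproblem does not. This is a hedged illustration of the partial-proximal idea only, not the authors' LSPPAD code: the line search and inexact subproblem solves are omitted, and the parameters rho and t are illustrative assumptions:

```python
def partial_proximal_ad(rho=1.0, t=1.0, iters=300):
    """Toy partial proximal alternating directions iteration for
    min (x-1)**2 + y**2  s.t.  x + y = 4, with multiplier u."""
    x, y, u = 0.0, 0.0, 0.0
    for _ in range(iters):
        # x-subproblem WITH proximal term (1/(2t))*(x - x_k)**2; closed-form
        # minimizer of (x-1)**2 + u*(x+y-4) + (rho/2)*(x+y-4)**2 + (1/(2t))*(x-x_k)**2
        x = (2.0 + x / t - u - rho * (y - 4.0)) / (2.0 + rho + 1.0 / t)
        # y-subproblem WITHOUT proximal term; closed-form minimizer of
        # y**2 + u*(x+y-4) + (rho/2)*(x+y-4)**2
        y = (-u - rho * (x - 4.0)) / (2.0 + rho)
        # multiplier update on the residual of the coupling constraint
        u = u + rho * (x + y - 4.0)
    return x, y

x, y = partial_proximal_ad()
# (x, y) converges to the constrained solution (2.5, 1.5)
```

Dropping the proximal term from the second subproblem leaves it as a plain augmented-Lagrangian minimization, which is the tractability advantage the abstract highlights.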


1997, Vol. 76 (2), pp. 265-284
Author(s): Roman Polyak, Marc Teboulle
