Global convergence rate of incremental aggregated gradient methods for nonsmooth problems

Author(s):  
N. Denizcan Vanli ◽  
Mert Gurbuzbalaban ◽  
Asuman Ozdaglar
2018 ◽  
Vol 28 (2) ◽  
pp. 1282-1300 ◽  

Filomat ◽  
2016 ◽  
Vol 30 (12) ◽  
pp. 3243-3252 ◽  
Author(s):  
Qiao-Li Dong ◽  
Songnian He ◽  
Yanmin Zhao

In this paper, we introduce two fast projection algorithms for solving the multiple-sets split feasibility problem (MSFP). Our algorithms accelerate the algorithms proposed in [8] and are proved to achieve a global convergence rate of O(1/n²). Preliminary numerical experiments show that these algorithms are practical and promising.
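The paper's specific algorithms are not reproduced here. As a rough illustration of the setting only, the sketch below applies Nesterov-style acceleration to the classical CQ projection iteration for a split feasibility problem (find x in C with Ax in Q); the function names, step size, and toy instance are assumptions, not taken from the paper.

```python
import numpy as np

def accelerated_cq(A, proj_C, proj_Q, x0, gamma, iters=1000):
    """Nesterov-accelerated CQ iteration for the split feasibility
    problem: find x in C with A @ x in Q.
    Illustrative sketch only, not the cited paper's algorithm.
    Minimizes f(x) = 0.5 * dist(Ax, Q)^2 over C via projected
    gradient steps with FISTA-style momentum."""
    x_prev = x0.copy()
    x = x0.copy()
    t = 1.0
    for _ in range(iters):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # momentum extrapolation (zero on the first pass, since t = 1)
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        r = A @ y
        grad = A.T @ (r - proj_Q(r))      # gradient of 0.5*dist(Ay, Q)^2
        x_prev = x
        x = proj_C(y - gamma * grad)      # projected gradient step
        t = t_next
    return x

# toy instance: C = nonnegative orthant, Q = box [0, 1]^m
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
proj_C = lambda v: np.maximum(v, 0.0)
proj_Q = lambda v: np.clip(v, 0.0, 1.0)
gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step 1/L, L = ||A||^2
x = accelerated_cq(A, proj_C, proj_Q, np.ones(3), gamma)
```

The momentum sequence t_k is the standard FISTA choice, which is what yields the O(1/n²) rate for this class of projected gradient methods.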


2015 ◽  
Vol 23 (3) ◽  
Author(s):  
Lori Badea

In [L. Badea, Global convergence rate of a standard multigrid method for variational inequalities, IMA J. Numer. Anal., 34 (2014), No. 1, 197-216], a global convergence rate of the standard monotone multigrid method for variational inequalities is derived. This algorithm can also be viewed as performing multiplicative iterations on each level together with multiplicative iterations over the levels. In the present paper, this algorithm, together with three other algorithms, which are combinations of additive or multiplicative iterations on levels with additive or multiplicative iterations over the levels, is analyzed in a unified manner and in a more general framework that allows us to consider problems in the Sobolev space W


2002 ◽  
Vol 14 (12) ◽  
pp. 2947-2957 ◽  
Author(s):  
Tianping Chen ◽  
Wenlian Lu ◽  
Shun-ichi Amari

We discuss recurrently connected neural networks and investigate their global exponential stability (GES). Sufficient conditions for a class of recurrent neural networks to be globally exponentially stable are given, along with a sharp convergence rate.
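As a rough illustration of what GES means in this setting (the model, parameters, and condition below are assumptions, not the paper's exact system), consider a Hopfield-type network dx/dt = -Dx + T·tanh(x) + I. When the coupling weights are small relative to the decay rates (roughly ||T||·Lip(tanh) < min D), trajectories started from different initial states approach each other exponentially fast:

```python
import numpy as np

def simulate(T, D, I, x0, dt=0.01, steps=4000):
    """Forward-Euler simulation of a Hopfield-type recurrent network
    dx/dt = -D*x + T @ tanh(x) + I.
    Illustrative model only, not the cited paper's exact system."""
    x = x0.copy()
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (-D * x + T @ np.tanh(x) + I)
        traj.append(x.copy())
    return np.array(traj)

rng = np.random.default_rng(1)
n = 4
T = 0.1 * rng.standard_normal((n, n))  # small weights: ||T|| < min(D) = 1
D = np.ones(n)                         # per-neuron decay rates
I = rng.standard_normal(n)             # constant external input
a = simulate(T, D, I, rng.standard_normal(n))
b = simulate(T, D, I, rng.standard_normal(n))
# under the GES condition, the gap between the two trajectories
# shrinks roughly like exp(-(min(D) - ||T||) * t)
gap = np.linalg.norm(a - b, axis=1)
```

Since tanh is 1-Lipschitz, the condition reduces to the spectral norm of T being below the smallest decay rate; the convergence rate in the exponent is what the sharp-rate results of this kind quantify precisely.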

