Exact order of Hoffman’s error bounds for elliptic quadratic inequalities derived from vector-valued Chebyshev approximation

2000 ◽  
Vol 88 (2) ◽  
pp. 223-253
Author(s):  
Martin Bartelt ◽  
Wu Li

Author(s):
Yuanxin Ma ◽
Hongwei Sun

In this paper, the regression learning algorithm with a vector-valued RKHS is studied. We motivate the need to extend the learning theory of scalar-valued functions and analyze the learning performance. In this setting, the output data lie in a Hilbert space $Y$, and the associated RKHS consists of functions whose values lie in $Y$. By establishing mathematical properties of the vector-valued integral operator $L_K$, capacity-independent error bounds and learning rates are derived by means of the integral operator technique.
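For a concrete illustration of regression with outputs in a Hilbert space, here is a minimal sketch of vector-valued kernel ridge regression using the separable operator-valued kernel K(x, x') = k(x, x')·I, a common special case (not necessarily the kernel class studied in the paper); the function names, the Gaussian scalar kernel, and the finite-dimensional output space are illustrative assumptions.

import numpy as np

def gaussian_gram(A, B, gamma=1.0):
    # Scalar Gaussian kernel matrix k(a_i, b_j) = exp(-gamma * ||a_i - b_j||^2)
    sq = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq)

def fit_vv_krr(X, Y, lam, gamma=1.0):
    """Vector-valued kernel ridge regression with the separable kernel
    K(x, x') = k(x, x') * I, so every output coordinate shares the same
    scalar Gram matrix G. Solves (G + lam * n * I) C = Y for the
    coefficient matrix C, giving f(x) = sum_i k(x, x_i) * C[i]."""
    n = X.shape[0]
    G = gaussian_gram(X, X, gamma)
    return np.linalg.solve(G + lam * n * np.eye(n), Y)

def predict_vv_krr(X_train, C, X_new, gamma=1.0):
    # Evaluate f at new points: rows are predicted output vectors
    return gaussian_gram(X_new, X_train, gamma) @ C

# Toy usage: inputs in R^2, outputs in R^3 (a finite-dimensional
# stand-in for the Hilbert output space Y)
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
Y = np.stack([np.sin(X[:, 0]), np.cos(X[:, 1]), X[:, 0] * X[:, 1]], axis=1)
C = fit_vv_krr(X, Y, lam=1e-2)
print(predict_vv_krr(X, C, X[:5]))  # first five fitted outputs, shape (5, 3)

With this separable kernel the vector-valued problem decouples into one scalar kernel ridge regression per output coordinate, which is why a single linear solve against the matrix Y suffices.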


2019 ◽  
Vol 25 ◽  
pp. 55
Author(s):  
Xi Yin Zheng ◽  
Kung-Fu Ng

Under either a linearity or a convexity assumption, several authors have studied the stability of error bounds for inequality systems when the data concerned undergo small perturbations. In this paper, we consider the corresponding issue for a more general conic inequality (most constraint systems in optimization can be described by an inequality of this type). In terms of coderivatives of vector-valued functions, we carry out a perturbation analysis of error bounds for conic inequalities in the subsmooth setting. The main results of this paper are new even in the convex/smooth case.
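For orientation, a standard formulation of an error bound for a conic inequality reads as follows; the notation (spaces $X$, $Y$, cone $C$, constant $\tau$) is conventional and not taken verbatim from the paper.

% Conic inequality: X, Y Banach spaces, C a closed convex cone in Y,
% f : X -> Y; the inequality f(x) <=_C 0 means f(x) \in -C.
\[
  S := \{\, x \in X : f(x) \in -C \,\}
\]
% f admits a local error bound at \bar{x} \in S if
\[
  \exists\, \tau > 0: \quad
  d(x, S) \;\le\; \tau\, d\bigl(f(x), -C\bigr)
  \quad \text{for all } x \text{ near } \bar{x}.
\]
% Taking f affine and C = \mathbb{R}^m_+ recovers the classical setting
% of Hoffman's error bound for linear inequality systems.

Stability of the error bound then asks whether such a constant $\tau$ survives when $f$ is subjected to small perturbations, which is the question the paper treats via coderivatives in the subsmooth setting.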

