On Computation of Generalized Derivatives of the Normal-Cone Mapping and Their Applications

2016 ◽  
Vol 41 (4) ◽  
pp. 1535-1556 ◽  
Author(s):  
Helmut Gfrerer ◽  
Jiří V. Outrata

2021 ◽  
Vol 2 (Original research articles) ◽  
Author(s):  
Matúš Benko

In this paper, we study continuity and Lipschitzian properties of set-valued mappings, focusing on inner-type conditions. We introduce the new notion of inner calmness* together with its relaxation, fuzzy inner calmness*. We show that polyhedral maps enjoy inner calmness* and examine the (fuzzy) inner calmness* of a multiplier mapping associated with constraint systems in depth. We then utilize these notions to develop new rules of generalized differential calculus, mainly for primal objects (e.g. tangent cones). In particular, we propose an exact chain rule for graphical derivatives. We apply these results to compute the derivatives of the normal-cone mapping, which are essential, e.g., for sensitivity analysis of variational inequalities.
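For orientation, the classical notions underlying these results can be recalled as follows (the starred inner variants introduced in the paper refine these definitions and are not reproduced here):

```latex
% A set-valued map M : R^n =3 R^m is calm at (\bar x, \bar y) \in gph M
% if there exist \kappa \ge 0 and neighborhoods U of \bar x, V of \bar y with
M(x) \cap V \subseteq M(\bar x) + \kappa \|x - \bar x\| \, \mathbb{B}
\quad \text{for all } x \in U.
% The graphical derivative of M at (\bar x, \bar y) is defined via the
% tangent cone to the graph of M:
DM(\bar x, \bar y)(u) = \{\, v \in \mathbb{R}^m \mid
(u, v) \in T_{\operatorname{gph} M}(\bar x, \bar y) \,\}.
```

The chain rule mentioned in the abstract concerns exactly this graphical derivative, i.e. a primal object built from tangent cones rather than from normal cones.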


Author(s):  
Vladimir Kulish ◽  
Kirill V. Poletkin

The paper presents an integral solution of the generalized one-dimensional phase-lagging heat equation with a convective term. The solution has been achieved by means of a novel technique involving generalized derivatives (in particular, derivatives of non-integer order). Confluent hypergeometric functions, known as Whittaker functions, arise in the solution procedure upon applying the Laplace transform to the original transport equation. The analytical solution is written in integral form and provides a relationship between the local values of temperature and heat flux. The solution is valid everywhere within the domain, including the domain boundary.
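For concreteness, one common single-phase-lag form of such an equation is sketched below under the assumption of constant coefficients; the generalized equation treated in the paper may contain additional or fractional-order terms:

```latex
\tau_q \, \frac{\partial^2 T}{\partial t^2}
+ \frac{\partial T}{\partial t}
+ u \, \frac{\partial T}{\partial x}
= \alpha \, \frac{\partial^2 T}{\partial x^2},
```

where $\tau_q$ is the phase lag of the heat flux, $u$ the convective velocity, and $\alpha$ the thermal diffusivity; setting $\tau_q = 0$ recovers the classical convection–diffusion equation.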


2011 ◽  
Vol 23 (5) ◽  
pp. 1343-1392 ◽  
Author(s):  
Thomas Villmann ◽  
Sven Haase

Supervised and unsupervised vector quantization methods for classification and clustering traditionally use dissimilarities, frequently taken as Euclidean distances. In this article, we investigate the applicability of divergences instead, focusing on online learning. We deduce the mathematical fundamentals for their use in gradient-based online vector quantization algorithms. This relies on the generalized derivatives of the divergences, known as Fréchet derivatives in functional analysis, which in finite-dimensional problems reduce to partial derivatives in a natural way. We demonstrate the application of this methodology for widely used supervised and unsupervised online vector quantization schemes, including self-organizing maps, neural gas, and learning vector quantization. Additionally, principles for hyperparameter optimization and relevance learning for parameterized divergences in the case of supervised vector quantization are given to achieve improved classification accuracy.
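The divergence-based learning rule described above can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the authors' code: it uses the generalized Kullback–Leibler divergence, whose partial derivatives dD(p||w)/dw_k = 1 - p_k/w_k instantiate the finite-dimensional reduction of the Fréchet derivative mentioned in the abstract, inside a basic online LVQ1 update. All function names and the learning-rate value are assumptions for illustration.

```python
import numpy as np

def gkl_divergence(p, w):
    # Generalized Kullback-Leibler divergence D(p || w) for strictly
    # positive vectors (no normalization assumed).
    return np.sum(p * np.log(p / w) - p + w)

def gkl_gradient(p, w):
    # Frechet derivative of D(p || .) at w, which in finite dimensions
    # reduces to the partial derivatives dD/dw_k = 1 - p_k / w_k.
    return 1.0 - p / w

def lvq1_step(prototypes, labels, x, y, lr=0.05):
    # One online LVQ1 step with a divergence in place of the squared
    # Euclidean distance: attract the winning prototype if its label
    # matches the sample's label, repel it otherwise.
    dists = np.array([gkl_divergence(x, w) for w in prototypes])
    k = int(np.argmin(dists))
    sign = 1.0 if labels[k] == y else -1.0
    # Gradient step; clip to keep the prototype strictly positive.
    prototypes[k] = np.maximum(
        prototypes[k] - sign * lr * gkl_gradient(x, prototypes[k]), 1e-12
    )
    return k
```

The same skeleton carries over to neural gas or self-organizing maps by replacing the single-winner update with a rank- or lattice-weighted update of all prototypes.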

