Approximation of probability measures by convex combinations of measures of spherically invariant processes

1981 ◽  
Vol 29 (4) ◽  
pp. 320-324
Author(s):  
I. V. Kozin
Fractals ◽  
2018 ◽  
Vol 26 (05) ◽  
pp. 1850076 ◽  
Author(s):  
D. LA TORRE ◽  
E. MAKI ◽  
F. MENDIVIL ◽  
E. R. VRSCAY

We are concerned with the approximation of probability measures on a compact metric space $X$ by invariant measures of iterated function systems with place-dependent probabilities (IFSPDPs). The approximation is performed by moment matching. Associated with an IFSPDP is a linear operator $A : \mathcal{D}(X) \to \mathcal{D}(X)$, where $\mathcal{D}(X)$ denotes the set of all infinite moment vectors of probability measures on $X$. Let $\mu$ be a probability measure that we desire to approximate, with moment vector $\mathbf{g}$. We then look for an IFSPDP which maps $\mathbf{g}$ as close to itself as possible in terms of an appropriate metric on $\mathcal{D}(X)$. Some computational results are presented.
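The moment-matching idea above can be illustrated in the simplest special case: an affine IFS on $[0,1]$ with *constant* (rather than place-dependent) probabilities, where the self-similarity relation makes the moment equations linear and triangular. The sketch below, with hypothetical names, solves them recursively; it is an illustration of the moment formalism, not the authors' algorithm.

```python
from math import comb

def ifs_moments(maps, probs, n_max):
    """Moments g_1..g_{n_max} of the invariant measure of an affine IFS
    on [0,1] with constant probabilities (a special case of the
    place-dependent setting discussed above).

    maps  : list of (a_i, b_i) for w_i(x) = a_i * x + b_i
    probs : list of p_i summing to 1

    Uses the self-similarity relation g_n = sum_i p_i * E[(a_i X + b_i)^n],
    which is linear and triangular in the g_k, so it solves recursively.
    """
    g = [1.0]  # g_0 = 1 for any probability measure
    for n in range(1, n_max + 1):
        # contributions from moments g_k with k < n, known from earlier steps
        rhs = sum(
            p * comb(n, k) * a**k * b**(n - k) * g[k]
            for (a, b), p in zip(maps, probs)
            for k in range(n)
        )
        # coefficient of the unknown g_n after moving it to the left-hand side
        lead = 1.0 - sum(p * a**n for (a, _), p in zip(maps, probs))
        g.append(rhs / lead)
    return g

# The IFS {x/2, x/2 + 1/2} with equal weights has Lebesgue measure on
# [0,1] as its invariant measure, whose n-th moment is 1/(n+1).
moments = ifs_moments([(0.5, 0.0), (0.5, 0.5)], [0.5, 0.5], 4)
```

Matching a target moment vector $\mathbf{g}$ then amounts to optimizing over the IFS parameters so that the fixed point of this recursion is close to $\mathbf{g}$.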


2020 ◽  
Vol 4 (1) ◽  
pp. 29-39
Author(s):  
Dilrabo Eshkobilova

Uniform properties of the functor Iof idempotent probability measures with compact support are studied. It is proved that this functor can be lifted to the category Unif of uniform spaces and uniformly continuous maps


2020 ◽  
pp. 1-13
Author(s):  
SEBASTIÁN PAVEZ-MOLINA

Abstract Let $(X,T)$ be a topological dynamical system. Given a continuous vector-valued function $F \in C(X, \mathbb{R}^{d})$ called a potential, we define its rotation set $R(F)$ as the set of integrals of $F$ with respect to all $T$-invariant probability measures, which is a convex body of $\mathbb{R}^{d}$. In this paper we study the geometry of rotation sets. We prove that if $T$ is a non-uniquely ergodic topological dynamical system with a dense set of periodic measures, then the map $R(\cdot)$ is open with respect to the uniform topologies. As a consequence, we obtain that the rotation set of a generic potential is strictly convex and has $C^{1}$ boundary. Furthermore, we prove that the map $R(\cdot)$ is surjective, extending a result of Kucherenko and Wolf.
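When periodic measures are dense, as in the hypothesis above, the rotation set can be approximated by Birkhoff averages of $F$ along periodic orbits. A minimal sketch, assuming the full 2-shift with the scalar potential $F(x) = x_0$ (the first symbol), where $R(F) = [0,1]$; the function name is hypothetical:

```python
from itertools import product

def rotation_set_1d(n_max):
    """Approximate the rotation set R(F) for the full shift on {0,1}
    with the scalar potential F(x) = x_0, using Birkhoff averages of F
    along all periodic orbits of period up to n_max.  Since periodic
    measures are dense for the full shift, these averages fill out
    R(F) = [0, 1] as n_max grows."""
    averages = set()
    for n in range(1, n_max + 1):
        for word in product((0, 1), repeat=n):
            # average of F over the periodic orbit generated by `word`
            averages.add(sum(word) / n)
    return min(averages), max(averages)

lo, hi = rotation_set_1d(5)  # endpoints of the approximated rotation set
```

For a vector-valued potential the same averages trace out points of a convex body in $\mathbb{R}^{d}$, whose hull approximates $R(F)$.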


Entropy ◽  
2021 ◽  
Vol 23 (4) ◽  
pp. 464
Author(s):  
Frank Nielsen

We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson’s information radius. The variational definition applies to an arbitrary distance and yields a new way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to belong to prescribed families of probability measures, we get relative Jensen-Shannon divergences and their equivalent Jensen-Shannon symmetrizations of distances that generalize the concept of information projections. Finally, we touch upon applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization tasks of probability measures, including statistical mixtures.
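The variational viewpoint can be seen already in the classical case: the Jensen-Shannon divergence is the minimum of $\tfrac{1}{2}\mathrm{KL}(p\,\|\,c) + \tfrac{1}{2}\mathrm{KL}(q\,\|\,c)$ over all distributions $c$, attained at the arithmetic mean $c = \tfrac{1}{2}(p+q)$. The sketch below, with hypothetical names, evaluates this objective along the segment between two discrete distributions and recovers the mean as the minimizer; it illustrates the variational definition only for the arithmetic mean, not the generic means of the paper.

```python
from math import log

def kl(p, q):
    """Kullback-Leibler divergence between two discrete distributions."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd_variational(p, q, ts):
    """Evaluate J(c_t) = 0.5*KL(p||c_t) + 0.5*KL(q||c_t) on the segment
    c_t = (1-t)*p + t*q and return (argmin t, minimum value).  The
    unconstrained minimizer over all distributions is the arithmetic
    mean, i.e. t = 0.5, where J equals the classical Jensen-Shannon
    divergence."""
    best_t, best = None, float("inf")
    for t in ts:
        c = [(1 - t) * pi + t * qi for pi, qi in zip(p, q)]
        val = 0.5 * kl(p, c) + 0.5 * kl(q, c)
        if val < best:
            best_t, best = t, val
    return best_t, best

p, q = [0.1, 0.4, 0.5], [0.3, 0.3, 0.4]
ts = [i / 10 for i in range(11)]
t_star, jsd = jsd_variational(p, q, ts)  # minimizer at the mid-point t = 0.5
```

Constraining $c$ to a prescribed family instead of the full simplex yields the relative Jensen-Shannon divergences described above.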

