Partially Gaussian stationary stochastic processes in discrete time

2013 ◽  
Vol 7 (2) ◽  
Author(s):  
K R Parthasarathy
2006 ◽  
Vol 06 (02) ◽  
pp. 173-183 ◽  
Author(s):  
DALIBOR VOLNÝ

We generalise the martingale-coboundary representation of discrete time stochastic processes to the non-stationary case and to random variables in Orlicz spaces. Related limit theorems (CLT, invariance principle, log–log law, probabilities of large deviations) are studied.
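For orientation, the stationary representation that this abstract generalizes is the classical martingale-coboundary (Gordin) decomposition, a standard statement sketched here in LaTeX:

```latex
% Martingale-coboundary decomposition of a stationary additive functional:
% T is the (measure-preserving) shift, (m \circ T^k)_{k \ge 0} a
% martingale-difference sequence, and g - g \circ T the coboundary.
f = m + g - g \circ T
```

Partial sums of $f$ then differ from those of $m$ only by a telescoping (bounded-in-norm) term, so the CLT and invariance principle for $f$ follow from the martingale case; the abstract extends this device to non-stationary processes and to Orlicz-space random variables.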


1995 ◽  
Vol 32 (4) ◽  
pp. 917-921
Author(s):  
Takis Konstantopoulos ◽  
Michael Zazanis

Neveu's exchange formula relates the Palm probabilities with respect to two jointly stationary simple point processes. We give a new proof of the exchange formula by using a simple result from discrete time stationary stochastic processes.



2016 ◽  
Vol 28 (12) ◽  
pp. 2853-2889 ◽  
Author(s):  
Hanyuan Hang ◽  
Yunlong Feng ◽  
Ingo Steinwart ◽  
Johan A. K. Suykens

This letter investigates the supervised learning problem with observations drawn from certain general stationary stochastic processes. Here by general, we mean that many stationary stochastic processes can be included. We show that when the stochastic processes satisfy a generalized Bernstein-type inequality, a unified treatment of learning schemes with various mixing processes can be conducted and a sharp oracle inequality for generic regularized empirical risk minimization schemes can be established. The obtained oracle inequality is then applied to derive convergence rates for several learning schemes such as empirical risk minimization (ERM), least squares support vector machines (LS-SVMs) using given generic kernels, and SVMs using gaussian kernels for both least squares and quantile regression. It turns out that for independent and identically distributed (i.i.d.) processes, our learning rates for ERM recover the optimal rates. For non-i.i.d. processes, including geometrically [Formula: see text]-mixing Markov processes, geometrically [Formula: see text]-mixing processes with restricted decay, [Formula: see text]-mixing processes, and (time-reversed) geometrically [Formula: see text]-mixing processes, our learning rates for SVMs with gaussian kernels match, up to some arbitrarily small extra term in the exponent, the optimal rates. For the remaining cases, our rates are at least close to the optimal rates. As a by-product, the assumed generalized Bernstein-type inequality also provides an interpretation of the so-called effective number of observations for various mixing processes.
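The setting of this abstract can be made concrete with a small sketch: data generated by an AR(1) process (a standard example of a geometrically mixing stationary process) and a regularized least-squares fit with a gaussian kernel, written here as plain kernel ridge regression (the bias-free form of an LS-SVM). This is only an illustrative assumption-laden sketch in NumPy, not the authors' method; the kernel width `gamma`, the regularizer `lam`, and the AR(1) coefficient are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# AR(1) process x_t = 0.5 x_{t-1} + eps_t: a simple stationary process
# that mixes geometrically (illustrative assumption, not from the paper).
n = 200
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()

# Supervised pairs from the time series: predict the next value.
X, y = x[:-1, None], x[1:]

def gaussian_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Regularized empirical risk minimization with least-squares loss:
# solve (K + lam I) alpha = y, i.e. kernel ridge regression.
lam = 1e-2
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(Xnew):
    return gaussian_kernel(Xnew, X) @ alpha

train_mse = np.mean((predict(X) - y) ** 2)
```

The dependence between consecutive observations is exactly what the paper's Bernstein-type inequality controls; the sketch only shows that the estimator itself is computed the same way as in the i.i.d. case.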


2018 ◽  
Vol 14 (1) ◽  
pp. 7540-7559
Author(s):  
MIŁOSŁAWA SOKÓŁ

Virtually every biological model utilising a random number generator is a Markov stochastic process. Numerical simulations of such processes are performed using stochastic or intensity matrices or kernels. Biologists, however, define stochastic processes in a slightly different way from how mathematicians typically do. A discrete-time discrete-value stochastic process may be defined by a function p : X0 × X → {f : Υ → [0, 1]}, where X is a set of states, X0 is a bounded subset of X, and Υ is a subset of the integers (here associated with discrete time), such that p satisfies 0 ≤ p(x, y)(t) ≤ 1 and Σ_y p(x, y)(t) = 1. This definition generalizes a stochastic matrix. Although X0 is bounded, X may include every possible state and is often infinite. By interrupting the process whenever the state transitions into the set X − X0, Markov stochastic processes defined this way may have non-square stochastic matrices. A similar principle applies to the intensity matrices and the stochastic and intensity kernels that arise when many biological models are considered as Markov stochastic processes. The class of such processes has important properties when considered from the point of view of theoretical mathematics. In particular, every process in this class may be simulated (hence they all exist in a physical sense) and has a well-defined probability space associated with it.
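The definition above can be sketched directly: a transition matrix with rows indexed by the bounded subset X0 and columns by the full state set X is non-square, and simulation is interrupted once the state leaves X0. The concrete states and probabilities below are hypothetical, chosen only to illustrate the construction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical illustration: states X = {0, 1, 2, 3}, transitions
# defined only from the bounded subset X0 = {0, 1}.  P therefore has
# |X0| rows and |X| columns, i.e. it is a non-square stochastic matrix;
# each row still sums to 1, generalizing a (square) stochastic matrix.
X = [0, 1, 2, 3]
X0 = {0, 1}
P = np.array([[0.5, 0.3, 0.2, 0.0],
              [0.1, 0.6, 0.0, 0.3]])

def simulate(start, max_steps=100):
    """Run the chain from `start`, interrupting once the state
    falls into X - X0 (or after max_steps transitions)."""
    path = [start]
    state = start
    for _ in range(max_steps):
        if state not in X0:
            break  # interrupted: absorbed into X - X0
        state = rng.choice(X, p=P[state])
        path.append(state)
    return path

path = simulate(0)
```

Every state on the path except possibly the last lies in X0; the final state may belong to X − X0, which is exactly the interruption rule the abstract describes.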

