Truncation approximation of the limit probabilities for denumerable semi-Markov processes

1975 ◽  
Vol 12 (1) ◽  
pp. 161-163 ◽  
Author(s):  
Richard L. Tweedie

It is shown that methods used by the author to approximate limit probabilities for Markov processes from their Q-matrices extend to semi-Markov processes. The limit probabilities for semi-Markov processes can be approximated using only truncations of the embedded Markov chain transition matrix and the vector of mean holding times.
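The quantity being approximated can be sketched in a few lines. For an ergodic semi-Markov process, the limit probability of state i is proportional to ν_i·m_i, where ν is the stationary distribution of the embedded chain and m_i is the mean holding time in state i. The sketch below uses illustrative names (not from the paper); the row-renormalised truncation is just one possible augmentation scheme, and conditions under which such truncations actually converge are the paper's subject.

```python
def semi_markov_limits(P, m, iters=5_000):
    """Approximate limit probabilities from an (n x n) truncation P of the
    embedded chain's transition matrix and mean holding times m (length n)."""
    n = len(P)
    # Renormalise the truncated rows so P is stochastic on the truncation
    # (one augmentation scheme among several).
    P = [[P[i][j] / sum(P[i]) for j in range(n)] for i in range(n)]
    # Stationary distribution of the embedded chain by power iteration
    # (assumes the truncated chain is irreducible and aperiodic).
    nu = [1.0 / n] * n
    for _ in range(iters):
        nu = [sum(nu[i] * P[i][j] for i in range(n)) for j in range(n)]
    # Weight by mean holding times and normalise.
    w = [nu_i * m_i for nu_i, m_i in zip(nu, m)]
    total = sum(w)
    return [x / total for x in w]
```

For a two-state example with embedded transition matrix [[0.5, 0.5], [0.5, 0.5]] and mean holding times (3, 1), the embedded stationary distribution is (1/2, 1/2) and the semi-Markov limits are (3/4, 1/4).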


Mathematics ◽  
2021 ◽  
Vol 9 (13) ◽  
pp. 1496
Author(s):  
Manuel L. Esquível ◽  
Nadezhda P. Krasii ◽  
Gracinda R. Guerreiro

We address the problem of finding a natural continuous-time Markov-type process, in open populations, that best captures the information provided by an open Markov chain in discrete time, which is usually the only observation available from data. Given the open discrete-time Markov chain, we single out two main approaches. In the first, we consider a calibration procedure for a continuous-time Markov process using the transition matrix of a discrete-time Markov chain, and we show that, when the discrete-time transition matrix is embeddable in a continuous-time one, the calibration problem has optimal solutions. In the second, we consider semi-Markov processes, and open Markov schemes, and we propose a direct extension from the discrete-time theory to the continuous-time one, using a known structural representation result for semi-Markov processes that decomposes the process as a sum of terms, each the product of a random variable of a discrete-time Markov chain and a time function built from an adequate increasing sequence of stopping times.
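The embeddability condition mentioned in the first approach has a closed form in the simplest case. This is only a sketch for 2×2 matrices, not the paper's general calibration procedure: P = [[1−a, a], [b, 1−b]] is embeddable in a continuous-time chain iff its second eigenvalue λ = 1 − a − b is positive, in which case Q = c(P − I) with c = ln(λ)/(λ − 1) is a generator satisfying expm(Q) = P.

```python
import math

def generator_2x2(a, b):
    """Generator Q with expm(Q) = [[1-a, a], [b, 1-b]], or None if the
    matrix is not embeddable (second eigenvalue lam = 1 - a - b <= 0)."""
    if a == 0.0 and b == 0.0:
        return [[0.0, 0.0], [0.0, 0.0]]  # P is the identity; Q = 0
    lam = 1.0 - a - b
    if lam <= 0.0:
        return None
    c = math.log(lam) / (lam - 1.0)
    return [[-c * a, c * a], [c * b, -c * b]]

def expm_2x2(Q, terms=60):
    """Matrix exponential by truncated power series (fine for small Q)."""
    E = [[1.0, 0.0], [0.0, 1.0]]   # running sum, starts at the identity
    T = [[1.0, 0.0], [0.0, 1.0]]   # current term Q^k / k!
    for k in range(1, terms):
        T = [[sum(T[i][r] * Q[r][j] for r in range(2)) / k
              for j in range(2)] for i in range(2)]
        E = [[E[i][j] + T[i][j] for j in range(2)] for i in range(2)]
    return E
```

Since Q and P − I share eigenvectors, expm(Q) has eigenvalues 1 and λ, recovering P exactly; when λ ≤ 0 no real generator exists and the calibration problem of the abstract becomes one of finding a best approximation rather than an exact embedding.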


2019 ◽  
Vol 44 (3) ◽  
pp. 282-308 ◽  
Author(s):  
Brian G. Vegetabile ◽  
Stephanie A. Stout-Oswald ◽  
Elysia Poggi Davis ◽  
Tallie Z. Baram ◽  
Hal S. Stern

Predictability of behavior is an important characteristic in many fields, including biology, medicine, marketing, and education. When a sequence of actions performed by an individual can be modeled as a stationary, time-homogeneous Markov chain, the predictability of the individual's behavior can be quantified by the entropy rate of the process. This article compares three estimators of the entropy rate of finite Markov processes. The first two methods estimate the entropy rate directly through estimates of the transition matrix and stationary distribution of the process. The third method is based on the sliding-window Lempel–Ziv compression algorithm. The methods are compared via a simulation study and in the context of a study of interactions between mothers and their children.
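The first ("plug-in") style of estimator described above is short to state. The entropy rate of a stationary Markov chain is H = −Σ_i π_i Σ_j P_ij log P_ij, where π is the stationary distribution of the transition matrix P; the function names below are illustrative, not from the article.

```python
import math

def stationary(P, iters=5_000):
    """Stationary distribution by power iteration from the uniform start."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    """H = -sum_i pi_i sum_j P_ij log P_ij (zero entries contribute 0)."""
    pi = stationary(P)
    return -sum(pi[i] * P[i][j] * math.log(P[i][j])
                for i in range(len(P)) for j in range(len(P))
                if P[i][j] > 0.0)

def estimate_from_sequence(seq, n_states):
    """Plug-in estimate: count transitions, normalise rows, apply the formula."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    P_hat = []
    for row in counts:
        s = sum(row)
        P_hat.append([c / s for c in row] if s
                     else [1.0 / n_states] * n_states)
    return entropy_rate(P_hat)
```

For a fair-coin chain (all transition probabilities 1/2) the entropy rate is log 2, the maximum for two states; a perfectly alternating sequence yields an estimated rate of 0, i.e. fully predictable behavior.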


1982 ◽  
Vol 19 (01) ◽  
pp. 90-98
Author(s):  
J. Janssen ◽  
J. M. Reinhard

The duality results, well known for the classical random walk and generalized by Janssen (1976) to (J-X) processes (sequences of random variables defined on a finite Markov chain), are extended to a class of multivariate semi-Markov processes. Just as in the classical case, these duality results lead to connections between some models of risk theory and queueing theory.

2002 ◽  
Vol 34 (01) ◽  
pp. 241-259
Author(s):  
Félix Belzunce ◽  
Eva-María Ortega ◽  
José M. Ruiz

The purpose of this paper is to study ageing properties of first-passage times of increasing Markov chains. We extend the literature to some new ageing classes, such as the IFR(2), NBU(2), DRLLt and NBULt classes. We also give sufficient conditions in the finite case that are computationally more efficient, stated solely in terms of the transition matrix K in the discrete case or the generator matrix Q in the continuous case. For uniformizable continuous-time Markov processes, we derive conditions in terms of the discrete uniformized Markov chain for the NBU(2) and NBULt classes. In the last section we review the main results in this direction in the literature and compare some of the conditions stated in this paper with others given there for some other ageing classes. Some examples where these results are applied are given.
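As a simpler illustration than the classes treated in the paper, consider the plain IFR property for a finite increasing chain: the first-passage time T to the top state has a pmf computable from powers of the transition matrix restricted to the transient states, and IFR means the failure rate r(n) = P(T = n)/P(T ≥ n) is non-decreasing. The chain and function names below are illustrative examples, not taken from the paper.

```python
def first_passage_pmf(K, start, target, horizon):
    """pmf of the first hitting time of `target` from `start` under K."""
    n = len(K)
    trans = [i for i in range(n) if i != target]  # transient states
    idx = {s: k for k, s in enumerate(trans)}
    Ksub = [[K[i][j] for j in trans] for i in trans]
    dist = [0.0] * len(trans)
    dist[idx[start]] = 1.0
    pmf = []
    for _ in range(horizon):
        # Probability of hitting the target on this step.
        pmf.append(sum(dist[idx[i]] * K[i][target] for i in trans))
        # Advance the (defective) distribution over transient states.
        dist = [sum(dist[a] * Ksub[a][b] for a in range(len(trans)))
                for b in range(len(trans))]
    return pmf

def is_ifr(pmf):
    """Check that the failure rate P(T = n)/P(T >= n) is non-decreasing."""
    surv = sum(pmf)
    rates = []
    for p in pmf:
        if surv <= 1e-9:   # stop before truncation noise dominates
            break
        rates.append(p / surv)
        surv -= p
    return all(r1 <= r2 + 1e-12 for r1, r2 in zip(rates, rates[1:]))
```

For the upward birth chain 0 → 1 → 2 with K = [[0.5, 0.5, 0], [0, 0.5, 0.5], [0, 0, 1]], T is the sum of two geometric(1/2) variables, a negative binomial, whose failure rate (n − 1)/(2n) increases to 1/2, so the check reports IFR.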

COSMOS ◽  
2005 ◽  
Vol 01 (01) ◽  
pp. 87-94 ◽  
Author(s):  
Chii-Ruey Hwang

Let π be a probability density proportional to exp(−U(x)) on S. A Markov process converging to π(x) may be regarded as a "conceptual" algorithm. Assume that S is a finite set. Let X0, X1, …, Xn, … be a Markov chain with transition matrix P and invariant probability π. Under suitable conditions on P, it is known that the ergodic average (f(X0) + ⋯ + f(Xn))/(n + 1) converges to π(f), and the corresponding asymptotic variance v(f, P) depends only on f and P. It is natural to consider criteria vw(P) and va(P), defined respectively by maximizing and averaging v(f, P) over f. Two families of transition matrices are considered. There are four problems to be investigated. Some results and conjectures are given. As for the continuum case, to accelerate convergence a family of diffusions with drift −∇U(x) + C(x), where div(C(x) exp(−U(x))) = 0, is considered.
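The asymptotic variance v(f, P) mentioned above can be computed for a finite chain as v(f, P) = Var_π(f) + 2 Σ_{k≥1} Cov_π(f(X0), f(Xk)), a geometrically convergent series that can be truncated numerically. The sketch below uses illustrative function names, not anything from the paper.

```python
def matmul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def asymptotic_variance(P, pi, f, terms=500):
    """v(f, P) = Var_pi(f) + 2 * sum_{k>=1} Cov_pi(f(X_0), f(X_k)),
    with the series truncated after `terms` lags."""
    n = len(P)
    mean = sum(pi[i] * f[i] for i in range(n))
    fc = [f[i] - mean for i in range(n)]               # centred f
    v = sum(pi[i] * fc[i] ** 2 for i in range(n))      # k = 0 term: Var_pi(f)
    Pk = [row[:] for row in P]                         # holds P^k
    for _ in range(terms):
        v += 2.0 * sum(pi[i] * fc[i] * Pk[i][j] * fc[j]
                       for i in range(n) for j in range(n))
        Pk = matmul(Pk, P)
    return v
```

For a two-state chain with hold probabilities 0.8 and 0.9 and f the indicator of state 1, the lag-k covariance is π0·π1·λ^k with λ = 0.7, so v = π0·π1·(1 + λ)/(1 − λ) = 34/27; comparing such values across transition matrices with the same π is exactly the kind of question the criteria vw(P) and va(P) formalize.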


1972 ◽  
Vol 4 (02) ◽  
pp. 258-270 ◽  
Author(s):  
E. Arjas

A fundamental identity, due to Miller (1961a), (1962a, b) and Kemperman (1961), is generalized to semi-Markov processes. The identity thus applies to processes defined on a Markov chain with discrete state space and to random walks with Markov-dependent steps (Section 2). Wald's identity is discussed briefly in Section 3. Section 4 studies the maxima of partial sums, and Section 5 the maxima in a semi-Markov process.
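The objects of Sections 4–5 are easy to simulate: a random walk whose step law depends on the current state of a driving Markov chain (a Markov random walk), together with the running maximum of its partial sums. The two-state chain and Gaussian step laws below are illustrative choices, not from the paper.

```python
import random

def markov_walk(P, means, n, seed=0):
    """Return (final partial sum, running maximum) after n steps of a
    random walk with Markov-dependent steps driven by a two-state chain."""
    rng = random.Random(seed)
    state, s, m = 0, 0.0, 0.0
    for _ in range(n):
        s += rng.gauss(means[state], 1.0)  # state-dependent step law
        m = max(m, s)
        # Move the driving chain (rows of P are its transition probabilities).
        state = 0 if rng.random() < P[state][0] else 1
    return s, m
```

When the stationary mean drift is negative, the running maximum stays finite as n grows, which is the regime where identities of this type yield distributional results for the all-time maximum.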

