Exponential ergodicity in Markov renewal processes

1968 ◽  
Vol 5 (2) ◽  
pp. 387-400 ◽  
Author(s):  
Jozef L. Teugels

In [3], Kendall proved a solidarity theorem for irreducible denumerable discrete time Markov chains. Vere-Jones refined Kendall's theorem by obtaining uniform estimates [14], while Kingman proved analogous results for an irreducible continuous time Markov chain [4], [5]. We derive similar solidarity theorems for an irreducible Markov renewal process. The transient case is discussed in Section 3, and Section 4 deals with the positive recurrent case. Recently Cheong also proved solidarity theorems for semi-Markov processes [1]. His theorems use the Markovian structure, while our emphasis is on the renewal aspects of Markov renewal processes. An application to the M/G/1 queue is included in the last section.
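The object studied above, a Markov renewal process, can be sketched in a few lines. The two-state chain and the transition-dependent exponential sojourn times below are hypothetical, chosen only to illustrate the pair structure (X_n, T_n), not taken from the paper:

```python
import random

# Minimal sketch of a Markov renewal process (X_n, T_n): X_n is a Markov
# chain with transition matrix P, and the sojourn time between the n-th
# and (n+1)-th renewals is drawn from a distribution that may depend on
# the transition (X_n, X_{n+1}).  All numbers here are made up.
P = {0: [(0, 0.3), (1, 0.7)],
     1: [(0, 0.6), (1, 0.4)]}

def next_state(i, rng):
    u, acc = rng.random(), 0.0
    for j, p in P[i]:
        acc += p
        if u <= acc:
            return j
    return P[i][-1][0]

def sojourn(i, j, rng):
    # Exponential sojourn whose rate depends on the transition (i, j);
    # any positive distribution would do here.
    return rng.expovariate(1.0 + i + 2 * j)

def simulate(n_jumps, rng, x0=0):
    x, t, path = x0, 0.0, [(x0, 0.0)]
    for _ in range(n_jumps):
        j = next_state(x, rng)
        t += sojourn(x, j, rng)
        x = j
        path.append((x, t))
    return path

rng = random.Random(42)
path = simulate(10, rng)
```

Each element of `path` is a (state, renewal time) pair; the embedded jump chain alone is an ordinary discrete-time Markov chain, which is the renewal-versus-Markovian distinction the abstract draws.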


Mathematics ◽  
2020 ◽  
Vol 9 (1) ◽  
pp. 55
Author(s):  
P.-C.G. Vassiliou

For a G-inhomogeneous semi-Markov chain and a G-inhomogeneous Markov renewal process, we study the change from the real-world probability measure to a forward probability measure. We find the values of risky bonds using the forward probabilities that the bond will not default up to maturity for both processes. It is established in the form of a theorem that the forward probability measure does not alter the semi-Markov structure. In addition, the G-inhomogeneous Markov renewal process is founded, and a theorem is provided which proves that the Markov renewal structure is maintained under the forward probability measure. We show that an inhomogeneous semi-Markov chain is characterized by certain martingales, and that the same is true for Markov renewal processes. We discuss in depth the calibration of the G-inhomogeneous semi-Markov chain model and propose an algorithm for it. We conclude with an application to risky bonds.
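The no-default probability that prices the risky bond can be illustrated with a much simpler, homogeneous toy model (not the paper's G-inhomogeneous setting): for a discrete-time rating chain with an absorbing default state, the probability of surviving to maturity is a row sum of a power of the sub-matrix restricted to non-default states. The ratings and numbers below are hypothetical:

```python
import numpy as np

# Rating states {A, B} plus an absorbing default state D; the transition
# probabilities are illustrative only.
P = np.array([[0.90, 0.08, 0.02],   # from A
              [0.10, 0.80, 0.10],   # from B
              [0.00, 0.00, 1.00]])  # D is absorbing

Q = P[:2, :2]        # transitions among the non-default states
T = 5                # maturity, in periods

# survival[i] = P(no default within T periods | start in rating i),
# obtained as the i-th row sum of Q**T.
survival = np.linalg.matrix_power(Q, T).sum(axis=1)
```

In the semi-Markov setting the holding time in each rating is random rather than one period, and in the G-inhomogeneous setting the kernel varies with time, but the pricing quantity is this same survival probability computed under the forward measure.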


1981 ◽  
Vol 18 (3) ◽ 
pp. 752-756
Author(s):  
Per Kragh Andersen

A Markov renewal theorem necessary for the derivation of the moment formulas for a filtered Markov renewal process stated by Marcus (1974) is proved and its applications are outlined.


1999 ◽  
Vol 36 (2) ◽  
pp. 415-432 ◽  
Author(s):  
Frank Ball

In this paper, central limit theorems for multivariate semi-Markov sequences and processes are obtained, both as the number of jumps of the associated Markov chain tends to infinity and, if appropriate, as the time for which the process has been running tends to infinity. The theorems are widely applicable since many functions defined on Markov or semi-Markov processes can be analysed by exploiting appropriate embedded multivariate semi-Markov sequences. An application to a problem in ion channel modelling is described in detail. Other applications, including to multivariate stationary reward processes, counting processes associated with Markov renewal processes, the interpretation of Markov chain Monte Carlo runs and statistical inference on semi-Markov models are briefly outlined.
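The flavour of such a central limit theorem can be checked empirically in a deliberately trivial special case (an i.i.d. reward sequence, i.e. a chain whose transitions do not depend on the current state), which is far from the paper's multivariate semi-Markov setting but shows the standardization involved:

```python
import random, statistics

# Toy check: states are chosen uniformly from {0, 1}, the reward is
# f(0) = 1, f(1) = 3, so the per-step mean is 2 and the per-step
# variance is 1.  The standardized sums sqrt(n) * (mean reward - 2)
# should then be approximately N(0, 1).
def standardized_sum(n, rng):
    total = sum(1 if rng.random() < 0.5 else 3 for _ in range(n))
    return (total / n - 2.0) * n ** 0.5

rng = random.Random(7)
z = [standardized_sum(2000, rng) for _ in range(500)]
m, v = statistics.mean(z), statistics.variance(z)
```

For a genuine semi-Markov sequence the limiting variance picks up covariance terms from the embedded chain, which is where the theorems of the paper do real work.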


1999 ◽  
Vol 36 (4) ◽  
pp. 1045-1057 ◽  
Author(s):  
Yiqiang Q. Zhao ◽  
Wei Li ◽  
Attahiru Sule Alfa

In this paper, we consider a certain class of Markov renewal processes where the matrix of the transition kernel governing the Markov renewal process possesses some block-structured property, including repeating rows. Duality conditions and properties are obtained on two probabilistic measures which often play a key role in the analysis and computations of such a block-structured process. The method used here unifies two different concepts of duality. Applications of duality are also provided, including a characteristic theorem concerning recurrence and transience of a transition matrix with repeating rows and a batch arrival queueing model.
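A transition matrix with repeating rows of the kind described can be sketched concretely. The scalar blocks and truncation level below are hypothetical; below the boundary row, each row is the previous one shifted right by one position (the classical M/G/1-type structure):

```python
import numpy as np

A = [0.2, 0.5, 0.3]   # repeating blocks A0, A1, A2 (illustrative)
B = [0.4, 0.4, 0.2]   # boundary blocks B0, B1, B2 (illustrative)
n = 6                 # truncation level for this finite sketch

P = np.zeros((n, n))
P[0, :len(B)] = B
for i in range(1, n):
    for k, a in enumerate(A):
        j = i - 1 + k
        if j < n:
            P[i, j] = a

# The truncation loses mass in the last rows; fold it into the final
# column so every row remains a probability distribution.
P[:, -1] += 1.0 - P.sum(axis=1)
```

In the paper the entries A_k and B_k are themselves matrices rather than scalars, and the recurrence/transience characterization concerns the infinite, untruncated matrix.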


1992 ◽  
Vol 29 (1) ◽ 
pp. 116-128 ◽  
Author(s):  
C. Y. Teresa Lam

In this paper, we study the new better than used in expectation (NBUE) and new worse than used in expectation (NWUE) properties of Markov renewal processes. We show that a Markov renewal process belongs to a more general class of stochastic processes encountered in reliability or maintenance applications. We present sufficient conditions such that the first-passage times of these processes are new better than used in expectation. The results are applied to the study of shock and repair models, random repair time processes, inventory, and queueing models.
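The NBUE property itself is easy to verify numerically for a single lifetime distribution (a toy example, not the first-passage times of the paper): a variable X is NBUE when the mean residual life E[X - t | X > t] never exceeds the unconditional mean E[X]. A Weibull lifetime with shape 2 has increasing failure rate and is therefore NBUE:

```python
import math

def S(t):
    # Survival function of Weibull(shape=2, scale=1).
    return math.exp(-t * t)

def integral_S(a, b, steps=20000):
    # Trapezoidal rule for the integral of S over [a, b].
    h = (b - a) / steps
    return h * (S(a) / 2 + sum(S(a + i * h) for i in range(1, steps)) + S(b) / 2)

# E[X] equals the integral of the survival function (about sqrt(pi)/2 here).
mean = integral_S(0.0, 10.0)

def mrl(t):
    # Mean residual life E[X - t | X > t] = (integral of S from t) / S(t).
    return integral_S(t, 10.0) / S(t)
```

Checking `mrl(t) <= mean` at a grid of points confirms NBUE for this example; the paper's contribution is sufficient conditions under which the first-passage times of Markov renewal processes inherit this property.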


1971 ◽  
Vol 3 (1) ◽  
pp. 155-175 ◽  
Author(s):  
Manfred Schäl

In this paper, some results on the asymptotic behaviour of Markov renewal processes with auxiliary paths (MRPAP's) proved in other papers ([28], [29]) are applied to queueing theory. This approach to queueing problems may be regarded as an improvement of the method of Fabens [7] based on the theory of semi-Markov processes. The method of Fabens was also illustrated by Lambotte in [18], [32]. In the present paper the ordinary M/G/1 queue is generalized to allow service times to depend on the queue length immediately after the previous departure. Such models preserve the MRPAP-structure of the ordinary M/G/1 system. Recently, the asymptotic behaviour of the embedded Markov chain (MC) of this queueing model was studied by several authors. One aim of this paper is to answer the question of the relationship between the limiting distribution of the embedded MC and the limiting distribution of the original process with continuous time parameter. It turns out that these two limiting distributions coincide. Moreover some properties of the embedded MC and the embedded semi-Markov process are established. The discussion of the M/G/1 queue closes with a study of the rate of convergence at which the queueing process attains equilibrium.
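The embedded chain at departure epochs for such a queue can be simulated directly. The recursion Q_{n+1} = max(Q_n - 1, 0) + A_{n+1}, with A_{n+1} the number of Poisson arrivals during the next service, is the standard M/G/1 embedding; the queue-length-dependent service distribution below is a hypothetical illustration of the paper's generalization:

```python
import math, random

lam = 0.5   # Poisson arrival rate (illustrative)

def service_time(q, rng):
    # Toy dependence on the queue length left by the previous departure:
    # serve twice as fast when the queue is long.
    mean = 1.0 if q < 3 else 0.5
    return rng.expovariate(1.0 / mean)

def poisson(mu, rng):
    # Knuth's method for a Poisson(mu) variate.
    limit, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def embedded_chain(n, rng, q0=0):
    q, path = q0, [q0]
    for _ in range(n):
        arrivals = poisson(lam * service_time(q, rng), rng)
        q = max(q - 1, 0) + arrivals
        path.append(q)
    return path

path = embedded_chain(1000, random.Random(1))
```

The paper's result that the limiting distribution of this embedded chain coincides with that of the continuous-time queue-length process is what justifies studying the simpler embedded chain.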


2005 ◽  
Vol 42 (4) ◽ 
pp. 1031-1043 ◽  
Author(s):  
Frank Ball ◽  
Robin K. Milne

A simple, widely applicable method is described for determining factorial moments of N̂t, the number of occurrences in (0,t] of some event defined in terms of an underlying Markov renewal process, and asymptotic expressions for these moments as t → ∞. The factorial moment formulae combine to yield an expression for the probability generating function of N̂t, and thereby further properties of such counts. The method is developed by considering counting processes associated with events that are determined by the states at two successive renewals of a Markov renewal process, for which it both simplifies and generalises existing results. More explicit results are given in the case of an underlying continuous-time Markov chain. The method is used to provide novel, probabilistically illuminating solutions to some problems arising in the stochastic modelling of ion channels.
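What a factorial-moment formula delivers can be checked in the simplest possible case, with a plain Poisson process standing in for the Markov renewal counts of the paper: for a Poisson process of rate lam observed on (0, t], the count N satisfies E[N] = lam·t and E[N(N-1)] = (lam·t)²:

```python
import random

lam, t, reps = 1.0, 2.0, 20000   # illustrative parameters

def poisson_count(mu, rng):
    # Count arrivals of a rate-1 process on (0, mu] by summing
    # exponential inter-arrival gaps; the count is Poisson(mu).
    total, n = 0.0, 0
    while True:
        total += rng.expovariate(1.0)
        if total > mu:
            return n
        n += 1

rng = random.Random(3)
counts = [poisson_count(lam * t, rng) for _ in range(reps)]
m1 = sum(counts) / reps                        # estimates lam*t = 2
m2 = sum(n * (n - 1) for n in counts) / reps   # estimates (lam*t)**2 = 4
```

For counts driven by a Markov renewal process the factorial moments are no longer powers of a single rate, which is precisely what the paper's formulae and their t → ∞ asymptotics capture.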

