The stationary distribution of an interesting Markov chain

1972 ◽  
Vol 9 (1) ◽  
pp. 231-233 ◽  
Author(s):  
W. J. Hendricks

In a single-shelf library of N books we suppose that books are returned by placing them at one end of the shelf. The probability of selecting each book is assumed to be known, and only one book is removed at a time. The N! arrangements of the books are considered as states of an ergodic Markov chain for which we find the stationary distribution.
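A minimal numerical sketch of this move-to-front model for a tiny shelf (N = 3, with made-up selection probabilities): the chain on the 3! arrangements can be built explicitly and its stationary distribution checked against the product formula usually quoted for the move-to-front chain. The probabilities below are hypothetical, chosen only for illustration.

```python
import itertools
import numpy as np

p = {"a": 0.5, "b": 0.3, "c": 0.2}        # hypothetical selection probabilities
states = list(itertools.permutations(p))   # all 3! shelf arrangements
index = {s: i for i, s in enumerate(states)}

# Transition matrix: selecting book b moves it to the left end of the shelf.
P = np.zeros((len(states), len(states)))
for s in states:
    for b in s:
        t = (b,) + tuple(x for x in s if x != b)
        P[index[s], index[t]] += p[b]

# Stationary distribution by power iteration (chain is ergodic for p > 0).
pi = np.full(len(states), 1 / len(states))
for _ in range(10_000):
    pi = pi @ P

# Product form for move-to-front:
# pi(b1,...,bN) = prod_i p_{b_i} / (1 - p_{b_1} - ... - p_{b_{i-1}})
def closed_form(s):
    prob, used = 1.0, 0.0
    for b in s:
        prob *= p[b] / (1 - used)
        used += p[b]
    return prob

assert all(abs(pi[index[s]] - closed_form(s)) < 1e-9 for s in states)
```

For N = 2 the formula reduces to pi(a, b) = p_a, i.e., the book in front is simply the last one selected.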

1973 ◽  
Vol 10 (4) ◽  
pp. 886-890 ◽  
Author(s):  
W. J. Hendricks

In a single-shelf library of N books we suppose that books are selected one at a time and returned to the kth position on the shelf before another selection is made. Books are moved to the right or left as necessary to vacate position k. The probability of selecting each book is assumed to be known, and the N! arrangements of the books are considered as states of an ergodic Markov chain for which we find the stationary distribution.
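The move-to-position-k variant can be sketched the same way: build the transition matrix on the N! arrangements and extract the stationary distribution numerically. The selection probabilities and the choice k = 2 below are hypothetical; this only illustrates the chain, not the paper's closed-form answer.

```python
import itertools
import numpy as np

p = {"a": 0.5, "b": 0.3, "c": 0.2}        # hypothetical selection probabilities
k = 2                                      # return each selected book to position k (1-based)
states = list(itertools.permutations(p))
index = {s: i for i, s in enumerate(states)}

def move_to_k(s, b):
    # Remove b, then reinsert at position k; the other books shift
    # left or right as necessary to vacate that position.
    rest = [x for x in s if x != b]
    rest.insert(k - 1, b)
    return tuple(rest)

P = np.zeros((len(states), len(states)))
for s in states:
    for b in s:
        P[index[s], index[move_to_k(s, b)]] += p[b]

# Power iteration to the stationary distribution (the chain has self-loops,
# so it is aperiodic, and it is irreducible for strictly positive p).
pi = np.full(len(states), 1 / len(states))
for _ in range(10_000):
    pi = pi @ P
```

Setting k = 1 recovers the move-to-front chain of the 1972 paper.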


1991 ◽  
Vol 28 (1) ◽  
pp. 96-103 ◽  
Author(s):  
Daniel P. Heyman

We are given a Markov chain with states 0, 1, 2, ···. We want to get a numerical approximation of the steady-state balance equations. To do this, we truncate the chain, keeping the first n states, make the resulting matrix stochastic in some convenient way, and solve the finite system. The purpose of this paper is to provide some sufficient conditions that imply that as n tends to infinity, the stationary distributions of the truncated chains converge to the stationary distribution of the given chain. Our approach is completely probabilistic, and our conditions are given in probabilistic terms. We illustrate how to verify these conditions with five examples.
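The truncation procedure described here can be sketched on a chain whose stationary distribution is known exactly: a birth-death random walk on {0, 1, 2, ···} with hypothetical up/down probabilities, whose stationary law is geometric. Keeping the first n states and dumping the lost mass on the diagonal is one convenient way to make the truncated matrix stochastic.

```python
import numpy as np

p_up, p_down = 0.3, 0.5            # hypothetical birth-death chain on {0, 1, 2, ...}
rho = p_up / p_down                # exact stationary law is geometric: (1 - rho) * rho**i

def truncated_stationary(n):
    # Keep the first n states; any probability that would leave the block
    # is added to the diagonal, keeping each row stochastic.
    P = np.zeros((n, n))
    for i in range(n):
        if i + 1 < n:
            P[i, i + 1] = p_up
        if i > 0:
            P[i, i - 1] = p_down
        P[i, i] = 1 - P[i].sum()
    # Solve pi P = pi together with sum(pi) = 1 as a linear system.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

exact = lambda i: (1 - rho) * rho**i
pi40 = truncated_stationary(40)
err = max(abs(pi40[i] - exact(i)) for i in range(10))
```

As n grows, the stationary distributions of the truncated chains converge to the geometric law, in line with the convergence the paper establishes under its sufficient conditions.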


2021 ◽  
Vol 1722 ◽  
pp. 012084 ◽  
Author(s):  
A L H Achmad ◽  
Mahrudinda ◽  
B N Ruchjana

1968 ◽  
Vol 5 (2) ◽  
pp. 401-413 ◽  
Author(s):  
Paul J. Schweitzer

A perturbation formalism is presented which shows how the stationary distribution and fundamental matrix of a Markov chain containing a single irreducible set of states change as the transition probabilities vary. Expressions are given for the partial derivatives of the stationary distribution and fundamental matrix with respect to the transition probabilities. Semi-group properties of the generators of transformations from one Markov chain to another are investigated. It is shown that a perturbation formalism exists in the multiple subchain case if and only if the change in the transition probabilities does not alter the number of, or intermix the various subchains. The formalism is presented when this condition is satisfied.
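One standard instance of this sensitivity formalism can be checked numerically: for a perturbation direction E whose rows sum to zero, the derivative of the stationary distribution is pi @ E @ Z, where Z = (I - P + 1 pi)^{-1} is the fundamental matrix. The 3-state chain and the direction E below are hypothetical.

```python
import numpy as np

def stationary(P):
    n = len(P)
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Hypothetical chain with a single irreducible set of states.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
pi = stationary(P)
Z = np.linalg.inv(np.eye(3) - P + np.outer(np.ones(3), pi))  # fundamental matrix

# Perturbation direction with zero row sums, so P + eps*E stays stochastic.
E = np.array([[-0.1, 0.1, 0.0],
              [ 0.0, 0.0, 0.0],
              [ 0.0, 0.0, 0.0]])

analytic = pi @ E @ Z                        # d pi / d eps at eps = 0
eps = 1e-6
numeric = (stationary(P + eps * E) - pi) / eps
gap = np.max(np.abs(analytic - numeric))     # small, up to finite-difference error
```

The zero-row-sum condition on E is what keeps the perturbed matrix stochastic, mirroring the paper's restriction that the perturbation not alter the subchain structure.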


10.37236/1134 ◽  
2006 ◽  
Vol 13 (1) ◽  
Author(s):  
Richard Brak ◽  
Sylvie Corteel ◽  
John Essam ◽  
Robert Parviainen ◽  
Andrew Rechnitzer

We give a combinatorial derivation and interpretation of the weights associated with the stationary distribution of the partially asymmetric exclusion process. We define a set of weight equations, which the stationary distribution satisfies. These allow us to find explicit expressions for the stationary distribution and normalisation using both recurrences and path models. To show that the stationary distribution satisfies the weight equations, we construct a Markov chain on a larger set of generalised configurations. A bijection on this new set of configurations allows us to find the stationary distribution of the new chain. We then show that a subset of the generalised configurations is equivalent to the original process and that the stationary distribution on this subset is simply related to that of the original chain. We also provide a direct proof of the validity of the weight equations.
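For small systems the stationary distribution of the partially asymmetric exclusion process can also be obtained by brute force, which is a useful cross-check on the combinatorial weights: build the generator on the 2^n particle configurations and solve the balance equations. The boundary rates alpha, beta and the hopping asymmetry q below are hypothetical.

```python
import itertools
import numpy as np

alpha, beta, q = 0.6, 0.4, 0.3    # hypothetical injection/ejection/left-hop rates
n = 3                              # number of sites
states = list(itertools.product([0, 1], repeat=n))
index = {s: i for i, s in enumerate(states)}

# Continuous-time generator of the open-boundary PASEP: inject at the left
# at rate alpha, eject at the right at rate beta, hop right at rate 1,
# hop left at rate q.
Q = np.zeros((len(states), len(states)))
def add(s, t, rate):
    Q[index[s], index[t]] += rate
    Q[index[s], index[s]] -= rate

for s in states:
    if s[0] == 0:
        add(s, (1,) + s[1:], alpha)
    if s[-1] == 1:
        add(s, s[:-1] + (0,), beta)
    for i in range(n - 1):
        if s[i] == 1 and s[i + 1] == 0:
            add(s, s[:i] + (0, 1) + s[i + 2:], 1.0)
        if s[i] == 0 and s[i + 1] == 1:
            add(s, s[:i] + (1, 0) + s[i + 2:], q)

# Stationary distribution: pi Q = 0 together with pi 1 = 1.
A = np.vstack([Q.T, np.ones(len(states))])
b = np.zeros(len(states) + 1); b[-1] = 1
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The resulting pi is what the paper's weight equations and path models describe in closed form; for larger n the direct solve becomes infeasible, which is where the combinatorial machinery pays off.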

