Characterizations of the geometric distribution by distributional properties

1983 ◽  
Vol 20 (4) ◽  
pp. 843-850 ◽  
Author(s):  
Mynt Zijlstra

Some new characterizations of the geometric distribution are studied. A generalization of the characterization by the well-known ‘lack-of-memory’ property is given, together with some closely related characterizations. Furthermore, the modified geometric distribution is characterized by a distributional property of the difference of two successive order statistics. The latter result extends work of Puri and Rubin (1970). Finally, the geometric distribution is characterized by a conditional distribution property of the difference of two arbitrary order statistics, which generalizes a result by Arnold (1980). Some of the results answer open questions posed in earlier papers.
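The lack-of-memory property the abstract builds on, P(X > m + n | X > m) = P(X > n), can be checked with a short simulation. This is only an illustrative sketch, not the paper's method; the success probability p = 0.3 and the pair (m, n) = (2, 3) are arbitrary choices:

```python
import random

random.seed(0)
p = 0.3  # success probability; an arbitrary illustration value

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    x = 1
    while random.random() >= p:
        x += 1
    return x

samples = [geometric(p) for _ in range(200_000)]

m, n = 2, 3
# Empirical P(X > n).
p_tail = sum(x > n for x in samples) / len(samples)
# Empirical P(X > m + n | X > m).
survivors = [x for x in samples if x > m]
p_cond = sum(x > m + n for x in survivors) / len(survivors)

# Lack of memory: the two probabilities agree up to simulation noise,
# and both are close to the exact tail (1 - p)^n.
print(round(p_tail, 3), round(p_cond, 3))
```

For the geometric distribution both quantities equal (1 − p)^n; the paper's contribution is a generalization of the converse, i.e. conditions under which this property forces the distribution to be geometric.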


1980 ◽  
Vol 17 (2) ◽  
pp. 570-573 ◽  
Author(s):  
Barry C. Arnold

Let X1, X2, …, Xn be independent identically distributed positive integer-valued random variables with order statistics X1:n, X2:n, …, Xn:n. If the Xi's have a geometric distribution then the conditional distribution of Xk+1:n – Xk:n given Xk+1:n – Xk:n > 0 is the same as the distribution of X1:n–k. Also the random variable X2:n – X1:n is independent of the event [X1:n = 1]. Under mild conditions each of these two properties characterizes the geometric distribution.
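Arnold's first property can be verified empirically for the case k = 1: the spacing X2:n − X1:n, conditioned on being positive, should match the minimum of n − 1 fresh draws. A simulation sketch with arbitrary illustration values p = 0.4 and n = 5, comparing the two distributions at one tail point:

```python
import random

random.seed(1)
p, n, reps = 0.4, 5, 100_000  # arbitrary illustration values

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    x = 1
    while random.random() >= p:
        x += 1
    return x

diffs = []   # X_{2:n} - X_{1:n}, kept only when positive
minima = []  # X_{1:n-1} from independent samples of size n - 1

for _ in range(reps):
    xs = sorted(geometric(p) for _ in range(n))
    if xs[1] - xs[0] > 0:
        diffs.append(xs[1] - xs[0])
    minima.append(min(geometric(p) for _ in range(n - 1)))

# The two empirical distributions should agree; compare P(value > 1)
# for each as a quick one-point check (exactly (1 - p)^(n - 1)).
t_diff = sum(d > 1 for d in diffs) / len(diffs)
t_min = sum(v > 1 for v in minima) / len(minima)
print(round(t_diff, 3), round(t_min, 3))
```

A one-point tail comparison is of course weaker than the distributional identity the theorem states, but it fails quickly for non-geometric inputs, which is the point of a sanity check.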


2020 ◽  
Vol 8 (1) ◽  
pp. 22-35
Author(s):  
M. Shakil ◽  
M. Ahsanullah

The objective of this paper is to characterize the distribution of the condition number of a complex Gaussian matrix. Several new distributional properties of this distribution are given. Based on these properties, some characterizations of the distribution are given by truncated moments, order statistics and upper record values.
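The object being characterized, the condition number κ = σ_max/σ_min of a matrix with independent standard complex normal entries, is easy to sample. A minimal sketch (matrix size and sample count are arbitrary choices, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 2, 20_000  # 2x2 matrices; sizes are arbitrary illustration values

# Complex Gaussian matrices: independent standard complex normal entries.
g = rng.standard_normal((reps, n, n)) + 1j * rng.standard_normal((reps, n, n))

# np.linalg.svd handles stacked matrices; singular values are returned
# in descending order for each matrix.
s = np.linalg.svd(g, compute_uv=False)
conds = s[:, 0] / s[:, -1]  # condition number = sigma_max / sigma_min

# The condition number is always >= 1 and its distribution is heavy-tailed,
# so the median is a more stable summary than the mean.
print(float(np.median(conds)))
```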


2009 ◽  
Vol 21 (3) ◽  
pp. 688-703 ◽  
Author(s):  
Vincent Q. Vu ◽  
Bin Yu ◽  
Robert E. Kass

Information estimates such as the direct method of Strong, Koberle, de Ruyter van Steveninck, and Bialek (1998) sidestep the difficult problem of estimating the joint distribution of response and stimulus by instead estimating the difference between the marginal and conditional entropies of the response. While this is an effective estimation strategy, it tempts the practitioner to ignore the role of the stimulus and the meaning of mutual information. We show here that as the number of trials increases indefinitely, the direct (or plug-in) estimate of marginal entropy converges (with probability 1) to the entropy of the time-averaged conditional distribution of the response, and the direct estimate of the conditional entropy converges to the time-averaged entropy of the conditional distribution of the response. Under joint stationarity and ergodicity of the response and stimulus, the difference of these quantities converges to the mutual information. When the stimulus is deterministic or nonstationary the direct estimate of information no longer estimates mutual information, which is no longer meaningful, but it remains a measure of variability of the response distribution across time.
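The direct-method quantity the abstract analyzes — plug-in marginal entropy minus the time-averaged plug-in conditional entropy — can be written down in a few lines. The toy data below is our own construction (a binary response whose rate varies across time bins of a repeated stimulus), not the paper's setting:

```python
import math
import random
from collections import Counter

random.seed(2)

def plugin_entropy(samples):
    """Plug-in (direct) entropy estimate in bits from a list of symbols."""
    counts = Counter(samples)
    total = len(samples)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Toy setting: a binary response whose spike probability depends on the
# time bin within a repeated stimulus (rates are arbitrary choices).
n_trials, n_bins = 5000, 8
rates = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]  # P(spike) per bin

trials = [[int(random.random() < rates[t]) for t in range(n_bins)]
          for _ in range(n_trials)]

# Marginal entropy: pool responses over all bins and trials.
pooled = [r for trial in trials for r in trial]
h_marginal = plugin_entropy(pooled)

# Conditional entropy: plug-in entropy within each time bin, averaged over bins.
h_conditional = sum(
    plugin_entropy([trial[t] for trial in trials]) for t in range(n_bins)
) / n_bins

info = h_marginal - h_conditional  # direct-method information estimate (bits)
print(round(info, 3))
```

The paper's point is visible in this construction: the "stimulus" here is the deterministic time course of the rates, so the estimate measures variability of the response distribution across time rather than mutual information with a random stimulus.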


2008 ◽  
Vol 45 (2) ◽  
pp. 575-579 ◽  
Author(s):  
Devdatt Dubhashi ◽  
Olle Häggström

For an order statistic (X1:n,…,Xn:n) of a collection of independent but not necessarily identically distributed random variables, and any i ∈ {1,…,n}, the conditional distribution of (Xi+1:n,…,Xn:n) given Xi:n > s is shown to be stochastically increasing in s. This answers a question by Hu and Xie (2006).
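One implication of the stated monotonicity — that the marginal conditional distribution of Xi+1:n given Xi:n > s has tails increasing in s — can be checked by simulation. A sketch using i.i.d. exponential variables (the distribution, thresholds, and tail point are all arbitrary illustration choices; the theorem itself covers non-identically distributed variables and the joint distribution):

```python
import random

random.seed(3)
n, i, reps = 4, 1, 200_000  # condition on the i-th order statistic (i = 1)

# For each threshold s, collect X_{2:4} from samples whose minimum exceeds s.
next_given = {0.1: [], 0.3: []}

for _ in range(reps):
    xs = sorted(random.expovariate(1.0) for _ in range(n))
    for s in next_given:
        if xs[i - 1] > s:
            next_given[s].append(xs[i])

# Stochastic increase: P(X_{2:4} > t | X_{1:4} > s) should grow with s.
t = 1.0
tails = {s: sum(x > t for x in v) / len(v) for s, v in next_given.items()}
print(round(tails[0.1], 3), round(tails[0.3], 3))
```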

