Direct calculation of probabilities of sums of independent lattice random variables

1994 ◽  
Vol 34 (1) ◽  
pp. 45-52
Author(s):  
Š. Jakševičius
Entropy ◽  
2020 ◽  
Vol 22 (6) ◽  
pp. 707 ◽  
Author(s):  
Neri Merhav ◽  
Igal Sason

This work is an extension of our earlier article, where a well-known integral representation of the logarithmic function was explored and was accompanied with demonstrations of its usefulness in obtaining compact, easily-calculable, exact formulas for quantities that involve expectations of the logarithm of a positive random variable. Here, in the same spirit, we derive an exact integral representation (in one or two dimensions) of the moment of a nonnegative random variable, or the sum of such independent random variables, where the moment order is a general positive non-integer real (also known as fractional moments). The proposed formula is applied to a variety of examples with an information-theoretic motivation, and it is shown how it facilitates their numerical evaluations. In particular, when applied to the calculation of a moment of the sum of a large number, n, of nonnegative random variables, it is clear that integration over one or two dimensions, as suggested by our proposed integral representation, is significantly easier than the alternative of integrating over n dimensions, as needed in the direct calculation of the desired moment.
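The abstract does not reproduce the representation itself, but a classical identity of this kind, valid for moment orders 0 < rho < 1, is x^rho = (rho / Gamma(1 - rho)) * Int_0^inf (1 - e^{-ux}) u^{-rho-1} du, so that E[S^rho] for a sum S of independent nonnegative variables reduces to a one-dimensional integral over the product of their Laplace transforms. A minimal Python sketch of this reduction (the function name and quadrature scheme are illustrative, not from the paper):

```python
import math

def frac_moment_of_sum(laplace_transforms, rho, x_lo=-30.0, x_hi=30.0, steps=20000):
    """E[S^rho] for S = X_1 + ... + X_n with independent nonnegative X_i
    and 0 < rho < 1, via the one-dimensional representation
        E[S^rho] = rho / Gamma(1 - rho)
                   * Int_0^inf (1 - Prod_i E[exp(-u X_i)]) u^(-rho-1) du.
    The substitution u = e^x turns this into a smooth integral over the
    real line, handled here by the trapezoidal rule."""
    h = (x_hi - x_lo) / steps
    total = 0.0
    for k in range(steps + 1):
        u = math.exp(x_lo + k * h)
        prod = 1.0
        for lt in laplace_transforms:
            prod *= lt(u)
        val = (1.0 - prod) * u ** (-rho)  # u^(-rho-1) du = u^(-rho) dx
        total += val if 0 < k < steps else 0.5 * val
    return rho / math.gamma(1.0 - rho) * total * h

# Sanity check: a sum of n i.i.d. Exp(1) variables is Gamma(n, 1),
# whose rho-th moment is Gamma(n + rho) / Gamma(n).
n, rho = 10, 0.5
approx = frac_moment_of_sum([lambda u: 1.0 / (1.0 + u)] * n, rho)
exact = math.gamma(n + rho) / math.gamma(n)
```

Note that the cost of the quadrature does not grow with n, whereas a direct computation of E[(X_1 + ... + X_n)^rho] would require an n-dimensional integral, which is exactly the saving the abstract describes.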


1996 ◽  
Vol 3 (25) ◽  
Author(s):  
Devdatt P. Dubhashi ◽  
Desh Ranjan

This paper investigates the notion of negative dependence amongst random variables and attempts to advocate its use as a simple and unifying paradigm for the analysis of random structures and algorithms.
The assumption of independence between random variables is often very convenient for several reasons. Firstly, it makes analyses and calculations much simpler. Secondly, one has at hand a whole array of powerful mathematical concepts and tools from classical probability theory for the analysis, such as laws of large numbers, central limit theorems and large deviation bounds, which are usually derived under the assumption of independence. Unfortunately, the analysis of most randomized algorithms involves random variables that are not independent. In this case, classical tools from standard probability theory, like large deviation theorems, that are valid under the assumption of independence between the random variables involved, cannot be used as such. It is then necessary to determine under what conditions of dependence one can still use the classical tools.
It has been observed before [32, 33, 38, 8] that in some situations, even though the variables involved are not independent, one can still apply some of the standard tools that are valid for independent variables (directly or in suitably modified form), provided that the variables are dependent in specific ways. Unfortunately, it appears that in most cases somewhat ad hoc stratagems have been devised, tailored to the specific situation at hand, and that a unifying underlying theory that delves deeper into the nature of dependence amongst the variables involved is lacking. A frequently occurring scenario underlying the analysis of many randomised algorithms and processes involves random variables that are, intuitively, dependent in the following negative way: if one subset of the variables is "high", then a disjoint subset of the variables is "low".
In this paper, we bring to the forefront and systematize some precise notions of negative dependence in the literature, analyse their properties, compare them relative to each other, and illustrate them with several applications.
One specific paradigm involving negative dependence is the classical "balls and bins" experiment. Suppose we throw m balls into n bins independently at random. For i in [n], let Bi be the random variable denoting the number of balls in the ith bin. We will often refer to these variables as occupancy numbers. This is a classical probabilistic paradigm [16, 22, 26] (see also [31, sec. 3.1]) that underlies the analysis of many probabilistic algorithms and processes. In the case when the balls are identical, this gives rise to the well-known multinomial distribution [16, sec. VI.9]: there are m repeated independent trials (balls), where each trial (ball) can result in one of the outcomes E1, ..., En (bins). The probability of the realisation of event Ei is pi for i in [n] for each trial. (Of course, the probabilities are subject to the condition Sum_i pi = 1.) Under the multinomial distribution, for any integers m1, ..., mn such that Sum_i mi = m, the probability that for each i in [n] event Ei occurs mi times is

    m! / (m1! ... mn!) * p1^m1 ... pn^mn.

The balls and bins experiment is a generalisation of the multinomial distribution: in the general case, one can have an arbitrary set of probabilities for each ball: the probability that ball k goes into bin i is p_{i,k}, subject only to the natural restriction that for each ball k, Sum_i p_{i,k} = 1. The joint distribution function correspondingly has a more complicated form.
A fundamental natural question of interest is: how are these Bi related? Note that even though the balls are thrown independently of each other, the Bi variables are not independent; in particular, their sum is fixed to m.
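As a concrete illustration, the multinomial probability above can be evaluated directly (this helper is a sketch for illustration, not part of the paper):

```python
import math

def multinomial_pmf(counts, probs):
    """P(B_1 = m_1, ..., B_n = m_n) = m!/(m_1! ... m_n!) * p_1^m_1 ... p_n^m_n,
    where m = m_1 + ... + m_n is the total number of balls."""
    coef = math.factorial(sum(counts))
    for mi in counts:
        coef //= math.factorial(mi)
    return coef * math.prod(p ** mi for p, mi in zip(probs, counts))

# Three balls into two fair bins: P(B_1 = 2, B_2 = 1) = 3 * (1/2)^3 = 3/8.
p = multinomial_pmf([2, 1], [0.5, 0.5])
```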
Intuitively, the Bi's are negatively dependent on each other in the manner described above: if one set of variables is "high", a disjoint set is "low". However, establishing such assertions precisely by a direct calculation from the joint distribution function, though possible in principle, appears to be quite a formidable task, even in the case where the balls are assumed to be identical.
One of the major contributions of this paper is establishing that the Bi are negatively dependent in a very strong sense. In particular, we show that the Bi variables satisfy negative association and negative regression, two strong notions of negative dependence that we define precisely below. All the intuitively obvious assertions of negative dependence in the balls and bins experiment follow as easy corollaries. We illustrate the usefulness of these results by showing how to streamline and simplify many existing probabilistic analyses in the literature.
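Negative association implies, in particular, that the pairwise covariances of the occupancy numbers are nonpositive; for identical balls the multinomial case gives Cov(Bi, Bj) = -m * pi * pj for i != j. This weak consequence can be checked by brute-force enumeration (an illustrative sketch, not the paper's proof technique):

```python
import itertools

def exact_cov(m, probs, i, j):
    """Exact Cov(B_i, B_j) for m identical balls thrown independently into
    bins with probabilities `probs`, by enumerating all n^m assignments
    of balls to bins and summing their probabilities."""
    n = len(probs)
    e_i = e_j = e_ij = 0.0
    for assign in itertools.product(range(n), repeat=m):
        p = 1.0
        for b in assign:
            p *= probs[b]
        bi, bj = assign.count(i), assign.count(j)
        e_i += p * bi
        e_j += p * bj
        e_ij += p * bi * bj
    return e_ij - e_i * e_j

# Multinomial case: Cov(B_i, B_j) = -m * p_i * p_j when i != j.
cov = exact_cov(4, [0.2, 0.3, 0.5], 0, 1)  # expected: -4 * 0.2 * 0.3 = -0.24
```

The enumeration is exponential in m and serves only to make the negative correlation tangible on a tiny instance; the point of the paper is precisely to obtain such conclusions without this kind of direct calculation.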


1966 ◽  
Vol 6 (4) ◽  
pp. 569-583
Author(s):  
A. Mitalauskas ◽  
V. Statulevičius

The abstracts (in two languages) can be found in the PDF file of the article. Original author names and title in Russian and Lithuanian: А. А. Миталаускас, В. А. Статулявичюс. Локальные предельные теоремы и асимптотические разложения для сумм независимых решетчатых случайных величин; A. Mitalauskas, V. Statulevičius. Lokalinė ribinė teorema ir asimptotinis išdėstymas nepriklausomų rėtinių atsitiktinių dydžių sumom. (In English: Local limit theorems and asymptotic expansions for sums of independent lattice random variables.)


1986 ◽  
Vol 23 (04) ◽  
pp. 1013-1018
Author(s):  
B. G. Quinn ◽  
H. L. MacGillivray

Sufficient conditions are presented for the limiting normality of sequences of discrete random variables possessing unimodal distributions. The conditions are applied to obtain normal approximations directly for the hypergeometric distribution and the stationary distribution of a special birth-death process.
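The abstract's sufficient conditions are not reproduced here, but the kind of approximation obtained can be illustrated numerically: a continuity-corrected normal density matched to the hypergeometric mean n*K/N and variance n*(K/N)*(1 - K/N)*(N - n)/(N - 1). The parameter values below are illustrative choices, not from the paper:

```python
import math

def hypergeom_pmf(N, K, n, k):
    """P(X = k) when k of the n draws (without replacement) from a
    population of size N containing K marked items are marked."""
    return math.comb(K, k) * math.comb(N - K, n - k) / math.comb(N, n)

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def normal_approx_pmf(N, K, n, k):
    """Continuity-corrected normal approximation matched to the
    hypergeometric mean and variance."""
    mu = n * K / N
    sd = math.sqrt(n * (K / N) * (1 - K / N) * (N - n) / (N - 1))
    return normal_cdf((k + 0.5 - mu) / sd) - normal_cdf((k - 0.5 - mu) / sd)

# Near the mean the two agree closely for these parameters.
N, K, n, k = 1000, 400, 100, 40
exact = hypergeom_pmf(N, K, n, k)
approx = normal_approx_pmf(N, K, n, k)
```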

