Generating functions are powerful tools for analyzing discrete random variables. They provide compact representations of probability distributions and simplify calculations of important properties like moments and convolutions.
This section explores three types of generating functions: probability, moment, and cumulant. Each offers unique insights into distribution characteristics and helps solve complex probability problems more efficiently.
Generating Functions
Probability Generating Functions
Probability generating function (PGF) encodes a discrete probability distribution
Represented as $G_X(s) = E[s^X] = \sum_{k=0}^{\infty} p_k s^k$
Provides a compact representation of the entire probability distribution
Useful for calculating moments and deriving properties of random variables
Differentiation of PGF yields factorial moments of the distribution
Evaluating the PGF at s = 1 always gives 1, since $G_X(1) = \sum_{k=0}^{\infty} p_k = 1$
PGF of the Poisson distribution: $G_X(s) = e^{\lambda(s-1)}$
Binomial distribution PGF: $G_X(s) = (q + ps)^n$, where p is the success probability and q = 1 − p
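These identities can be checked numerically. Below is a minimal sketch (the helper names `poisson_pmf` and `pgf` are mine, not from the text) that builds the Poisson PGF from its truncated power series and verifies that G(1) = 1, that the series matches the closed form $e^{\lambda(s-1)}$, and that G′(1) recovers the mean:

```python
import math

# Hypothetical helpers, for illustration only
def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

def pgf(s, lam, terms=150):
    # G_X(s) = sum_k p_k s^k, truncated after `terms` terms
    return sum(poisson_pmf(k, lam) * s**k for k in range(terms))

lam = 3.0

# G(1) = 1: the probabilities sum to one
assert abs(pgf(1.0, lam) - 1.0) < 1e-9

# Truncated series agrees with the closed form exp(lam*(s - 1))
assert abs(pgf(0.4, lam) - math.exp(lam * (0.4 - 1))) < 1e-9

# G'(1) = E[X] = lam, estimated with a central difference
h = 1e-5
mean_est = (pgf(1 + h, lam) - pgf(1 - h, lam)) / (2 * h)
assert abs(mean_est - lam) < 1e-3
```

The same central-difference trick applied twice would estimate the second factorial moment E[X(X−1)] from G″(1).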
Moment Generating Functions
Moment generating function (MGF) offers an alternative representation of a probability distribution
Defined as $M_X(t) = E[e^{tX}] = \sum_{k=0}^{\infty} e^{tk} p_k$ for discrete random variables
Generates moments of the distribution through differentiation
nth moment obtained by evaluating nth derivative of MGF at t = 0
MGF uniquely determines the probability distribution (when it exists in a neighborhood of t = 0)
Useful for proving distribution properties and deriving sums of independent random variables
MGF of the normal distribution: $M_X(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$
Exponential distribution MGF: $M_X(t) = \frac{\lambda}{\lambda - t}$ for t < λ
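The moment-from-derivative rule can be illustrated with the exponential MGF and finite differences. A quick sketch (function names are illustrative) checks that the first two derivatives at t = 0 give E[X] = 1/λ and E[X²] = 2/λ²:

```python
import math  # imported for consistency; only arithmetic is used below

lam = 2.0

def mgf(t):
    # MGF of an Exponential(lam) variable, valid for t < lam
    return lam / (lam - t)

h = 1e-4
# First derivative at 0: E[X] = 1/lam
m1 = (mgf(h) - mgf(-h)) / (2 * h)
# Second derivative at 0: E[X^2] = 2/lam^2
m2 = (mgf(h) - 2 * mgf(0.0) + mgf(-h)) / h**2

assert abs(m1 - 1 / lam) < 1e-4
assert abs(m2 - 2 / lam**2) < 1e-4
```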
Cumulant Generating Functions
Cumulant generating function (CGF) is the natural logarithm of the moment generating function
Defined as $K_X(t) = \ln(M_X(t))$
Generates cumulants of the distribution through differentiation
Cumulants provide alternative set of distribution descriptors to moments
First cumulant equals mean, second cumulant equals variance
Higher-order cumulants measure deviations from normality
CGF of the Poisson distribution: $K_X(t) = \lambda(e^t - 1)$
Normal distribution CGF: $K_X(t) = \mu t + \frac{1}{2}\sigma^2 t^2$
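The identities K′(0) = mean and K″(0) = variance can be checked the same way for the Poisson CGF, where both cumulants equal λ (a sketch with illustrative names):

```python
import math

lam = 4.0

def mgf(t):
    # MGF of a Poisson(lam) variable
    return math.exp(lam * (math.exp(t) - 1))

def cgf(t):
    # CGF is the log of the MGF: K(t) = lam*(e^t - 1)
    return math.log(mgf(t))

h = 1e-4
# K'(0) = mean = lam
k1 = (cgf(h) - cgf(-h)) / (2 * h)
# K''(0) = variance = lam (mean equals variance for Poisson)
k2 = (cgf(h) - 2 * cgf(0.0) + cgf(-h)) / h**2

assert abs(k1 - lam) < 1e-3
assert abs(k2 - lam) < 1e-3
```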
Combining Distributions
Convolution of Distributions
Convolution operation combines the distributions of two independent random variables
Resulting distribution represents sum of the two random variables
Probability mass function (PMF) of the convolution: $p_Z(k) = \sum_{i=0}^{k} p_X(i) p_Y(k-i)$
Convolution of two Poisson distributions yields another Poisson distribution (the parameters add)
Sum of independent normal distributions results in normal distribution
PGF of convolution equals product of individual PGFs
MGF of convolution also equals product of individual MGFs
Convolution useful in analyzing queuing systems and network traffic
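The discrete convolution formula and the Poisson closure property can be verified together: convolving Poisson(λ₁) and Poisson(λ₂) PMFs should reproduce the Poisson(λ₁ + λ₂) PMF. A minimal sketch (helper names are mine):

```python
import math

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

def convolve_pmf(px, py, kmax):
    # p_Z(k) = sum_{i=0}^{k} p_X(i) p_Y(k - i)
    return [sum(px(i) * py(k - i) for i in range(k + 1))
            for k in range(kmax + 1)]

lam1, lam2 = 1.5, 2.5
pz = convolve_pmf(lambda k: poisson_pmf(k, lam1),
                  lambda k: poisson_pmf(k, lam2), 20)

# The convolution should match Poisson(lam1 + lam2) term by term
for k, p in enumerate(pz):
    assert abs(p - poisson_pmf(k, lam1 + lam2)) < 1e-12
```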
Compound Distributions
Compound distribution arises when parameter of one distribution is itself a random variable
Combines two or more probability distributions
Common compound distributions include compound Poisson and negative binomial
Compound Poisson distribution models number of events with random batch sizes
PGF of a compound distribution: $G_X(s) = G_N(G_Y(s))$
MGF of a compound distribution: $M_X(t) = M_N(\ln(M_Y(t)))$
Useful in modeling insurance claims, particle physics, and telecommunications
Compound binomial distribution arises in credit risk modeling
Compound geometric distribution applied in reliability theory and actuarial science
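The composition rule $G_X(s) = G_N(G_Y(s))$ can be illustrated with Poisson thinning: if N is Poisson(λ) and each Y is Bernoulli(p), the compound variable is Poisson(λp), so the composed PGF should equal $e^{\lambda p(s-1)}$. A sketch (function names are illustrative):

```python
import math

lam, p = 3.0, 0.4

def G_N(s):
    # PGF of N ~ Poisson(lam)
    return math.exp(lam * (s - 1))

def G_Y(s):
    # PGF of Y ~ Bernoulli(p)
    return (1 - p) + p * s

def G_X(s):
    # Compound PGF: G_X(s) = G_N(G_Y(s))
    return G_N(G_Y(s))

# Poisson thinning: X should be Poisson(lam * p)
for s in (0.0, 0.3, 0.7, 1.0):
    assert abs(G_X(s) - math.exp(lam * p * (s - 1))) < 1e-12
```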
Moments
Factorial Moments
Factorial moments alternative to ordinary moments for characterizing distributions
nth factorial moment defined as $E[X(X-1)(X-2)\cdots(X-n+1)]$
Generated by successive differentiation of probability generating function
Factorial moments often simpler to calculate than ordinary moments
Relate to ordinary moments through Stirling numbers of the second kind
Useful in combinatorial problems and analysis of branching processes
nth factorial moment of the Poisson distribution equals $\lambda^n$
Binomial distribution factorial moments: $E[X^{(n)}] = n!\binom{N}{n}p^n$
Negative binomial distribution factorial moments: $E[X^{(n)}] = \frac{r(r+1)\cdots(r+n-1)}{(1-p)^n}p^n$
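The Poisson factorial-moment formula $E[X^{(n)}] = \lambda^n$ can be checked directly by summing falling factorials against the PMF. A minimal sketch (helper names are mine):

```python
import math

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

def falling_factorial(k, n):
    # k(k-1)...(k-n+1)
    out = 1
    for j in range(n):
        out *= (k - j)
    return out

lam = 2.0
for n in range(1, 5):
    # nth factorial moment: E[X(X-1)...(X-n+1)], truncated sum
    fm = sum(falling_factorial(k, n) * poisson_pmf(k, lam)
             for k in range(100))
    assert abs(fm - lam**n) < 1e-9
```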