The probability generating function of a non-negative integer-valued random variable \( X \), i.e. one taking values in \( \{0, 1, 2, \dots\} \), is defined to be
\begin{align}
G(s) = \mathbb{E}[s^X] = \sum_k s^k \mathbb{P}(X=k) = \sum_k s^k f(k)
\end{align}
where \( f \) is the probability mass function of \( X \). Note that \( G(0) = \mathbb{P}(X=0) \) and \( G(1) = 1 \), and the series converges at least for \( |s| \leq 1 \).
Generating functions are very useful for studying sums of independent random variables and for calculating the moments of a distribution.
Examples.
Constant Variables
If \( \mathbb{P}\left\{X = c\right\} = 1 \) for some non-negative integer \( c \), then \( G(s) = \mathbb{E}[s^X] = s^c \).
Bernoulli variables
If \( \mathbb{P}\left\{X=1\right\} = p \) and \( \mathbb{P} \left\{X=0\right\} = 1-p \) then \( G(s) = \mathbb{E}[s^X] = 1-p + ps \).
Binomial variable
If \( X \) is binomial with parameters \( n \) and \( p \), and \( q = 1-p \), then \( G(s) = \mathbb{E}[s^X] = (q + ps)^n \).
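As a quick sanity check (a minimal sketch; the parameters \( n \), \( p \), \( s \) below are arbitrary illustrative choices), the defining sum can be compared against the closed form:

```python
from math import comb

# Hypothetical parameters, chosen only for illustration.
n, p = 10, 0.3
q = 1 - p
s = 0.7

# G(s) = E[s^X] computed directly from the binomial pmf ...
direct = sum(comb(n, k) * p**k * q**(n - k) * s**k for k in range(n + 1))

# ... and from the closed form (q + p s)^n.
closed = (q + p * s) ** n

print(abs(direct - closed) < 1e-12)  # the two agree
```

The agreement is just the binomial theorem applied to \( (q + ps)^n \).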
Poisson distribution
If \( X \) has a Poisson distribution with parameter \( \lambda \) then
\begin{align}
G(s) = \mathbb{E}(s^X) = \sum_{k=0}^\infty s^k \frac{\lambda^k}{k!}e^{-\lambda} = \sum_{k=0}^\infty \frac{(s\lambda)^k}{k!} e^{-\lambda} = e^{\lambda(s - 1)}
\end{align}
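A truncated version of the defining series matches the closed form quickly, since the terms decay factorially (a minimal sketch; \( \lambda \) and \( s \) below are arbitrary illustrative choices):

```python
from math import exp, factorial

# Hypothetical parameters, chosen only for illustration.
lam, s = 2.0, 0.5

# Partial sum of E[s^X] = sum_k s^k lam^k e^{-lam} / k!; fifty terms
# are far more than enough at this parameter value.
partial = sum(s**k * lam**k * exp(-lam) / factorial(k) for k in range(50))

closed = exp(lam * (s - 1))
print(abs(partial - closed) < 1e-12)  # the two agree
```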
The following result shows how generating functions can be used to calculate the moments of a distribution:
\( \frac{dG}{ds}(1) = \mathbb{E}(X) \) and in general
\begin{align}
\frac{d^k G}{ds^k}(1) = \mathbb{E}[X(X-1)\cdots(X-k+1)]
\end{align}
for \( k \geq 1 \).
To see this, take the \( k \)th derivative term by term:
\begin{align}
\frac{d^k G}{ds^k}(s) & = \sum_{n=k}^{\infty} n(n-1)\cdots(n-k+1)s^{n-k} \mathbb{P}(X=n) \\
&=\mathbb{E}[s^{X-k}X(X-1)\cdots(X-k+1)]
\end{align}
and evaluate at \( s=1 \).
Technical note: strictly, one must let \( s \uparrow 1 \) and then apply Abel's theorem.
Abel's Theorem.
Let \( G(s) = \sum_{i=0}^{\infty} c_i s^i \) where \( c_i \geq 0 \) and \( G(s) < \infty \) for \( |s| < 1 \). Then \( \lim_{s\uparrow 1}G(s) = G(1) \), whether this limit is finite or infinite.
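The factorial-moment result above can be checked numerically. A standard consequence (not stated above, but assumed here) is \( \operatorname{Var}(X) = G''(1) + G'(1) - G'(1)^2 \); for a Poisson variable both mean and variance equal \( \lambda \). A minimal sketch, with a hypothetical \( \lambda \):

```python
from math import exp, factorial

# Hypothetical Poisson parameter, chosen only for illustration.
lam = 3.0

def pmf(n):
    # Poisson probability mass function P(X = n).
    return lam**n * exp(-lam) / factorial(n)

N = 80  # truncation point; the Poisson tail beyond this is negligible

# Factorial moments E[X] and E[X(X-1)], i.e. G'(1) and G''(1).
m1 = sum(n * pmf(n) for n in range(N))
m2 = sum(n * (n - 1) * pmf(n) for n in range(N))

mean = m1                # equals lam for a Poisson variable
var = m2 + m1 - m1**2    # G''(1) + G'(1) - G'(1)^2, also lam here
print(abs(mean - lam) < 1e-9 and abs(var - lam) < 1e-9)
```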
A closely related generating function is the moment generating function.
The moment generating function of the non-negative integer-valued random variable \( X \) is defined to be
\begin{align}
M(t) = \mathbb{E}[e^{tX}] = \sum_k e^{tk} \mathbb{P}(X=k) = \sum_k e^{tk} f(k)
\end{align}
Note that \( M(t) = G(e^t) \).
As the name implies, the moment generating function gives a more direct route to the moments: taking the \( n \)th derivative and setting \( t = 0 \) gives the \( n \)th moment of \( X \), i.e. \( \frac{d^n M}{dt^n}(0) = \mathbb{E}[X^n] \).
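This can be checked numerically with finite differences. For a Poisson variable, \( M(t) = G(e^t) = e^{\lambda(e^t-1)} \), with \( \mathbb{E}[X] = \lambda \) and \( \mathbb{E}[X^2] = \lambda + \lambda^2 \). A minimal sketch, with a hypothetical \( \lambda \) and step size:

```python
from math import exp

# Hypothetical Poisson parameter, chosen only for illustration.
lam = 2.0

def M(t):
    # Poisson moment generating function M(t) = exp(lam * (e^t - 1)).
    return exp(lam * (exp(t) - 1.0))

h = 1e-4  # finite-difference step, small enough here

# Central differences approximate the first and second derivatives at t = 0.
d1 = (M(h) - M(-h)) / (2 * h)          # ~ E[X]   = lam
d2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # ~ E[X^2] = lam + lam^2

print(abs(d1 - lam) < 1e-4 and abs(d2 - (lam + lam**2)) < 1e-3)
```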
Proof.
\begin{align}
M(t) &= \sum_{k=0}^\infty e^{tk} \mathbb{P}(X=k) \\
& = \sum_{k=0}^\infty \left(\sum_{n=0}^\infty \frac{(tk)^n}{n!}\right) \mathbb{P}(X = k) \\
& = \sum_{n=0}^\infty \frac{t^n}{n!} \sum_{k=0}^\infty k^n \mathbb{P}(X=k)\\
& = \sum_{n=0}^\infty \frac{t^n}{n!} \mathbb{E}[X^n]
\end{align}
Differentiating \( n \) times and setting \( t = 0 \) picks out \( \mathbb{E}[X^n] \). (Interchanging the two sums is justified by Tonelli's theorem, since every term is non-negative when \( t \geq 0 \).)
Moment generating functions can be defined for more general random variables, which need be neither discrete nor non-negative.
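For a continuous variable the sum in the definition becomes an integral, \( M(t) = \int e^{tx} f(x)\,dx \). As a hedged illustration (a crude midpoint-rule integral, with an arbitrary choice of \( t \)), this recovers the well-known MGF \( e^{t^2/2} \) of a standard normal variable:

```python
from math import exp, pi, sqrt

def normal_mgf(t, n=200_000, lo=-10.0, hi=10.0):
    # M(t) = integral of e^{tx} * standard normal density, approximated
    # by a midpoint rule on [lo, hi]; the tails beyond are negligible.
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        total += exp(t * x) * exp(-x * x / 2) / sqrt(2 * pi) * dx
    return total

t = 0.5
print(abs(normal_mgf(t) - exp(t * t / 2)) < 1e-6)  # matches e^{t^2/2}
```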