Discrete Probability and Independence Questions
Joint Probability Distributions
Let’s assume we have 2 discrete random variables \( X \) and \( Y \).
\( X \in \{1,2,\dots,n\} \) and \( Y \in \{1,2,\dots,n\} \)
We can represent all the outcomes diagrammatically. Let \( n = 6 \).
X\Y | 1 | 2 | 3 | 4 | 5 | 6 |
---|---|---|---|---|---|---|
1 | \(a_{11}\) | \(a_{12}\) | \(a_{13}\) | \(a_{14}\) | \(a_{15}\) | \(a_{16}\) |
2 | \(a_{21}\) | \(a_{22}\) | \(a_{23}\) | \(a_{24}\) | \(a_{25}\) | \(a_{26}\) |
3 | \(a_{31}\) | \(a_{32}\) | \(a_{33}\) | \(a_{34}\) | \(a_{35}\) | \(a_{36}\) |
4 | \(a_{41}\) | \(a_{42}\) | \(a_{43}\) | \(a_{44}\) | \(a_{45}\) | \(a_{46}\) |
5 | \(a_{51}\) | \(a_{52}\) | \(a_{53}\) | \(a_{54}\) | \(a_{55}\) | \(a_{56}\) |
6 | \(a_{61}\) | \(a_{62}\) | \(a_{63}\) | \(a_{64}\) | \(a_{65}\) | \(a_{66}\) |
\( P(X = i \text{ and } Y =j ) = a_{ij} \)
Marginal Distributions.
If we want to know \( P(X = i) \) irrespective of what value \(Y\) takes then we are concerned with the marginal distribution of \(X\)
$$ P(X = i) = \sum_{j=1}^n a_{ij} $$
Similarly \( P(Y = j) \) irrespective of the values of \( X \) gives the marginal distribution of \( Y \)
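As a concrete illustration, here is a minimal Python sketch that computes both marginal distributions from a joint table by summing rows and columns; the 2×2 table used here is a made-up example, not one of the tables in this post.

```python
# A minimal sketch: marginal distributions from a joint probability table.
# The 2x2 table is a made-up example: a[i][j] = P(X = i+1 and Y = j+1).
a = [
    [0.125, 0.250],
    [0.375, 0.250],
]

# P(X = i) sums row i over all values of Y.
p_x = [sum(row) for row in a]

# P(Y = j) sums column j over all values of X.
p_y = [sum(row[j] for row in a) for j in range(len(a[0]))]

print(p_x)  # [0.375, 0.625]
print(p_y)  # [0.5, 0.5]
```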
Example.
Consider the probability of falling over when it rains. There are 2 outcomes ‘falling/not falling’ depending on whether it has ‘rained/not rained’. Consider the following table of probabilities. What is the probability you will fall given that it rains?
Weather\Outcome | Falling | Not Falling |
---|---|---|
Rain | \( \frac{1}{2} \) | \( \frac{1}{8} \) |
No Rain | \( \frac{1}{8} \) | \( \frac{1}{4} \) |
\begin{align} P(\text{fall} \mid \text{rain}) = \frac{P(\text{fall and rain})}{P(\text{rain})} \end{align}
\( P(\text{rain}) \) is the marginal probability that it rains, so we sum the probability of falling and raining and the probability of not falling and raining, i.e. \( P(\text{rain}) = \frac{1}{2} + \frac{1}{8} = \frac{5}{8} \), to get
\begin{align} P(\text{fall} \mid \text{rain}) = \frac{\frac{1}{2}}{\frac{1}{2} + \frac{1}{8}} = \frac{4}{5} \end{align}
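The same arithmetic can be checked with exact fractions in Python. This is just a sketch reusing the two probabilities from the "Rain" row of the table above.

```python
from fractions import Fraction

# The two joint probabilities in the "Rain" row of the table above.
p_fall_and_rain = Fraction(1, 2)
p_notfall_and_rain = Fraction(1, 8)

# Marginal probability of rain: sum the two cells in the "Rain" row.
p_rain = p_fall_and_rain + p_notfall_and_rain   # 5/8

# P(fall | rain) = P(fall and rain) / P(rain)
p_fall_given_rain = p_fall_and_rain / p_rain
print(p_fall_given_rain)                        # 4/5
```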
Generating Functions
The probability generating function of the non-negative random variable \( X \), where \( X \) takes values in \( \{0, 1, 2, \dots\} \), is defined to be
\begin{align}
G(s) = \mathbb{E}[s^X] = \sum_k s^k \mathbb{P}(X=k) = \sum_k s^k f(k)
\end{align}
\( G(0) = \mathbb{P}(X=0) \) and \( G(1) = 1 \). Since the coefficients are probabilities summing to 1, \( G(s) \) converges for \( |s| \le 1 \).
Generating functions are very useful to study the sums of independent random variables and to calculate the moments of a distribution.
Examples.
Constant Variables
If \( \mathbb{P}\left\{X = c\right\} = 1 \) then \( G(s) = \mathbb{E}[s^X] = s^c \)
Bernoulli variables
If \( \mathbb{P}\left\{X=1\right\} = p \) and \( \mathbb{P} \left\{X=0\right\} = 1-p \) then \( G(s) = \mathbb{E}[s^X] = 1-p + ps \)
Binomial variable
If \( X \) is binomial with parameters \( n \) and \( p \), and \( q = 1-p \), then \( G(s) = \mathbb{E}[s^X] = (q + ps)^n \)
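This formula follows because a binomial variable is the sum of \( n \) independent Bernoulli variables, and the generating function of a sum of independent variables is the product of their generating functions. Here is a small numerical sanity check in Python; the values of \( n \), \( p \) and \( s \) are arbitrary choices.

```python
from math import comb

n, p, s = 4, 0.3, 0.7   # arbitrary example values
q = 1 - p

# PGF computed directly from the binomial pmf: sum_k s^k * C(n, k) * p^k * q^(n-k).
pgf_from_pmf = sum(s**k * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

# Closed form: the Bernoulli PGF (q + p*s) multiplied together n times.
pgf_closed_form = (q + p * s) ** n

print(pgf_from_pmf, pgf_closed_form)  # both approximately 0.6857
```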
Poisson distribution
If \( X \) has a Poisson distribution with parameter \( \lambda \) then
\begin{align}
G(s) = \mathbb{E}(s^X) = \sum_{k=0}^\infty s^k \frac{\lambda^k}{k!}e^{-\lambda} = \sum_{k=0}^\infty \frac{(s\lambda)^k}{k!} e^{-\lambda} = e^{\lambda(s - 1)}
\end{align}
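As a quick sanity check, the truncated series and the closed form can be compared numerically; this Python sketch uses arbitrary values of \( \lambda \) and \( s \).

```python
from math import exp, factorial

lam, s = 2.5, 0.6   # arbitrary example values

# Truncated series sum_k s^k * lam^k * e^(-lam) / k!  (terms beyond k = 50 are negligible here).
series = sum(s**k * lam**k * exp(-lam) / factorial(k) for k in range(51))

# Closed form e^(lam * (s - 1)).
closed_form = exp(lam * (s - 1))

print(series, closed_form)  # both approximately 0.3679, i.e. e^-1, since lam * (s - 1) = -1
```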
The following result shows how generating functions can be used to calculate the moments of a distribution.
\( G'(1) = \mathbb{E}(X) \) and in general
\begin{align}
\frac{d^k G}{ds^k}(1) = \mathbb{E}[X(X-1)\cdots(X-k+1)]
\end{align}
for \( k \ge 1 \).
To see this take the kth derivative
\begin{align}
\frac{d^k G}{ds^k}(s) & = \sum_{n=k}^{\infty} n(n-1)\cdots(n-k+1)s^{n-k} \mathbb{P}(X=n) \\
&=\mathbb{E}[s^{X-k}X(X-1)\cdots(X-k+1)]
\end{align}
and evaluate at \( s=1 \).
Technical note: actually you need to let \( s \uparrow 1 \) and then apply Abel’s theorem.
Abel’s Theorem.
Let \( G(s) = \sum_{i=0}^{\infty} c_i s^i \) where \( c_i \ge 0 \) and \( G(s) < \infty \) for \( |s| < 1 \); then \( \lim_{s\uparrow 1}G(s) = G(1) = \sum_{i=0}^{\infty} c_i \), whether this sum is finite or infinite.
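As a numerical illustration of these identities, the following Python sketch approximates \( G'(1) \) and \( G''(1) \) for the Poisson generating function using one-sided differences with \( s \uparrow 1 \). The value \( \lambda = 3 \) is an arbitrary choice; for a Poisson variable \( \mathbb{E}(X) = \lambda \) and \( \mathbb{E}[X(X-1)] = \lambda^2 \).

```python
from math import exp

lam = 3.0   # arbitrary example value

def G(s):
    """Poisson probability generating function, G(s) = e^(lam * (s - 1))."""
    return exp(lam * (s - 1))

h = 1e-6

# First derivative at 1, approached from below: G'(1) = E(X) = lam.
print((G(1.0) - G(1.0 - h)) / h)                          # approximately 3.0

# Second derivative at 1: G''(1) = E[X(X-1)] = lam**2 for a Poisson variable.
print((G(1.0) - 2 * G(1.0 - h) + G(1.0 - 2 * h)) / h**2)  # approximately 9.0
```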
A closely related generating function is called the moment generating function.
The moment generating function of the non-negative integer-valued random variable, \( X \), is defined to be
\begin{align}
M(t) = \mathbb{E}[e^{tX}] = \sum_k e^{tk} \mathbb{P}(X=k) = \sum_k e^{tk} f(k)
\end{align}
Note that \( M(t) = G(e^t) \)
As its name implies, the moment generating function gives a more direct way to get the moments.
Taking the \( n \)th derivative of the moment generating function of \( X \) and setting \( t = 0 \) gives the \( n \)th moment of \( X \), i.e. \( \frac{d^n M}{dt^n}(0) = \mathbb{E}[X^n] \)
Proof.
\begin{align}
M(t) &= \sum_{k=0}^\infty e^{tk} \mathbb{P}(X=k) \\
& = \sum_{k=0}^\infty \left(\sum_{n=0}^\infty \frac{(tk)^n}{n!}\right) \mathbb{P}(X = k) \\
& = \sum_{n=0}^\infty \frac{t^n}{n!} \sum_{k=0}^\infty k^n \mathbb{P}(X=k)\\
& = \sum_{n=0}^\infty \frac{t^n}{n!} \mathbb{E}[X^n]
\end{align}
so the coefficient of \( \frac{t^n}{n!} \) is \( \mathbb{E}[X^n] \), and differentiating \( n \) times and setting \( t = 0 \) picks out exactly this coefficient.
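As a sketch of how this works in practice, the following Python snippet approximates \( M'(0) \) and \( M''(0) \) for the Poisson moment generating function \( M(t) = G(e^t) = e^{\lambda(e^t - 1)} \) by finite differences. The value \( \lambda = 2 \) is an arbitrary choice; for a Poisson variable \( \mathbb{E}[X] = \lambda \) and \( \mathbb{E}[X^2] = \lambda + \lambda^2 \).

```python
from math import exp

lam = 2.0   # arbitrary example value

def M(t):
    """Poisson moment generating function, M(t) = G(e^t) = e^(lam * (e^t - 1))."""
    return exp(lam * (exp(t) - 1))

h = 1e-5

# First moment: M'(0) = E[X] = lam.
print((M(h) - M(-h)) / (2 * h))          # approximately 2.0

# Second moment: M''(0) = E[X^2] = lam + lam**2.
print((M(h) - 2 * M(0) + M(-h)) / h**2)  # approximately 6.0
```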
Moment generating functions can be defined for more general random variables, not necessarily discrete or non-negative.
Permutations and Combinations
(Discrete) probability is essentially about counting: counting the number of ways an outcome of a random experiment can happen. We need some tricks to help us. 😃
Permutations
A permutation of a set is an ordering of the set. If you have N elements in a set and you change the order you have another permutation.
Example 1.
Consider the set \( A=\{ 1,2,3 \} \). We can re-arrange the items in a set to form the following permutations
\begin{array}{cc}
(1,2,3) & (1,3,2) \\
(2,1,3) & (2,3,1) \\
(3,1,2) & (3,2,1) \\
\end{array}
We have \( 3! = 3 \times 2 \times 1 = 6 \) permutations (see the next example for why this is the case).
The symbol \( N! \) (factorial) means \( N(N-1)\cdots 1 \)
Example 2.
Permutations of N different items.
If you have N different hats how many ways of ordering them are there? There are N ways to choose the first hat, N-1 ways to choose the second hat and so on.
We have in total $$ N(N-1)\cdots 1 = N! $$ ways of ordering the hats.
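A quick way to convince yourself of this is to enumerate the orderings in Python; this sketch uses the arbitrary choice \( N = 5 \).

```python
from itertools import permutations
from math import factorial

N = 5                            # arbitrary example value
hats = list(range(1, N + 1))     # N distinct hats, labelled 1..N

# Count every ordering and compare with N!.
print(len(list(permutations(hats))))  # 120
print(factorial(N))                   # 120
```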
Example 3.
Permutations with Repetition.
Suppose you have \( N \) hats, \( K \) black and \( N-K \) red. Apart from the colour they are indistinguishable. How many ways of ordering them are there? We know from above that there are \( N! \) orderings if the hats are all different. However, the \( K! \) permutations of the black hats among themselves are indistinguishable, as are the \( (N-K)! \) permutations of the red hats. Hence there are
$$\frac{N!}{K!(N-K)!}$$ orderings
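Here is a short Python check of this formula, with the arbitrary choice \( N = 5 \), \( K = 2 \): generate all \( N! \) orderings, discard duplicates, and compare the count with \( \frac{N!}{K!(N-K)!} \).

```python
from itertools import permutations
from math import factorial

N, K = 5, 2                            # arbitrary example values
hats = ['B'] * K + ['R'] * (N - K)     # K black hats and N - K red hats

# Distinct orderings: collect all N! permutations into a set to discard duplicates.
distinct_orderings = set(permutations(hats))

print(len(distinct_orderings))                            # 10
print(factorial(N) // (factorial(K) * factorial(N - K)))  # 10
```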
Combinations
A combination of a set, \( A \), is a subset of \( A \), and unlike permutations above the order does not matter. An example will make this clearer.
Example 4.
Let’s suppose \( A=\{ 1,2,3 \} \). Then the following sets are all the combinations of \( A \)
\begin{array}{ccc}
\{1,2\} & \{2,3\} & \{3,1\} \\
\{1\} & \{2\} & \{3\}\\
\{1,2,3\} & \emptyset & \\
\end{array}
You will notice there are 8 of them: each element of \( A \) can either be chosen or not chosen for the combination (2 choices), and there are three elements in \( A \) altogether, so there are \( 2^3 = 8 \) combinations in total.
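This can be checked by enumerating the subsets in Python, for example with itertools.combinations over each possible subset size.

```python
from itertools import combinations

A = [1, 2, 3]

# Enumerate every combination (subset) of A by choosing each size k = 0, 1, 2, 3 in turn.
subsets = [set(c) for k in range(len(A) + 1) for c in combinations(A, k)]

print(len(subsets))  # 8, i.e. 2**3
print(subsets)       # [set(), {1}, {2}, {3}, {1, 2}, {1, 3}, {2, 3}, {1, 2, 3}]
```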
To be more specific, subsets of size K are called K-combinations. Note that the sets \( \{1,2\}\) and \(\{2,1\}\) are the same 2-combination (subset) as changing the order of the elements does not create a new combination.
Let's concentrate on creating a K-combination of a set of N (distinct) elements. Note we don't actually have to bother with the word distinct because by definition a mathematical set contains distinct elements.
Example 5.
K-combinations of an N-set
Suppose \( A=\{ 1,2,3,4 \} \). Write down all the possible 3-combinations.
\begin{array}{cccc}
\{1,2,3\} & \{2,3,4\} & \{3,4,1\} & \{4,1,2\}
\end{array}
If we wanted to create all the permutations of subsets of \( A \) of size 3 we would have 4 choices for the first item, 3 choices for the second item and 2 choices for the third item, i.e. \( 4 \times 3 \times 2 \) permutations in total. Each one of the sets above has \( 3! \) permutations, as in Example 1, since it contains 3 elements, so the number of combinations is \(\frac{4 \times 3 \times 2}{3!}\) or \(\frac{4!}{(4-3)!3!}\). In general the number of K-combinations from a set of size N is $$\frac{N!}{(N-K)!K!}$$
\( \frac{N!}{(N-K)!K!}\) is such an important formula it has its own notation.
$$ C(N,K) = \frac{N!}{(N-K)!K!} $$
\( C(N,K) \) is sometimes also written as \( \binom{N}{K} \) and pronounced “N choose K”.
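As a final sketch, the count in Example 5 can be reproduced in Python in three ways: by enumerating the 3-combinations, by the factorial formula, and by the built-in math.comb, which computes \( \binom{N}{K} \) directly.

```python
from itertools import combinations
from math import comb, factorial

N, K = 4, 3          # as in Example 5
A = [1, 2, 3, 4]

print(len(list(combinations(A, K))))                      # 4: enumerate the 3-combinations
print(factorial(N) // (factorial(N - K) * factorial(K)))  # 4: the factorial formula
print(comb(N, K))                                         # 4: the built-in "N choose K"
```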