Discrete Probability and Independence Questions
Joint Probability Distributions
Let’s assume we have two discrete random variables X and Y with
X \in \{1,2,\dots,n\} and Y \in \{1,2,\dots,n\}
We can diagrammatically represent all the outcomes. Let n = 6.
X\Y | 1 | 2 | 3 | 4 | 5 | 6 |
---|---|---|---|---|---|---|
1 | a11 | a12 | a13 | a14 | a15 | a16 |
2 | a21 | a22 | a23 | a24 | a25 | a26 |
3 | a31 | a32 | a33 | a34 | a35 | a36 |
4 | a41 | a42 | a43 | a44 | a45 | a46 |
5 | a51 | a52 | a53 | a54 | a55 | a56 |
6 | a61 | a62 | a63 | a64 | a65 | a66 |
\mathbb{P}(X=i \text{ and } Y=j) = a_{ij}
Marginal Distributions.
If we want to know \mathbb{P}(X=i) irrespective of what value Y takes, then we are concerned with the marginal distribution of X:
\mathbb{P}(X=i) = \sum_{j=1}^{n} a_{ij}
Similarly, summing over the values of X, \mathbb{P}(Y=j) = \sum_{i=1}^{n} a_{ij} gives the marginal distribution of Y.
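As a quick sketch of the idea in Python (the 2×2 joint table below is made up purely for illustration, with exact fractions so the arithmetic is transparent), the marginals are just row and column sums of the joint table:

```python
from fractions import Fraction

# Made-up 2x2 joint distribution P(X = i and Y = j); entries sum to 1.
joint = [
    [Fraction(1, 2), Fraction(1, 8)],   # row for X = 1
    [Fraction(1, 8), Fraction(1, 4)],   # row for X = 2
]

# Marginal of X: sum each row over j.
marginal_X = [sum(row) for row in joint]
# Marginal of Y: sum each column over i.
marginal_Y = [sum(col) for col in zip(*joint)]

print(marginal_X)   # [Fraction(5, 8), Fraction(3, 8)]
print(marginal_Y)   # [Fraction(5, 8), Fraction(3, 8)]
assert sum(marginal_X) == 1 and sum(marginal_Y) == 1
```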
Example.
Consider the probability of falling over when it rains. There are two outcomes, ‘falling’ and ‘not falling’, depending on whether it has ‘rained’ or ‘not rained’. Consider the following table of probabilities. What is the probability you will fall given that it rains?
Event1\Event2 | Falling | Not Falling |
---|---|---|
Rain | 1/2 | 1/8 |
No Rain | 1/8 | 1/4 |
\mathbb{P}(\text{fall} \mid \text{rain}) = \frac{\mathbb{P}(\text{fall and rain})}{\mathbb{P}(\text{rain})}. Here \mathbb{P}(\text{rain}) is the marginal probability that it rains, so we sum the probabilities of (raining and falling) and (raining and not falling), i.e. \mathbb{P}(\text{rain}) = \frac{1}{2} + \frac{1}{8} = \frac{5}{8}, to get \mathbb{P}(\text{fall} \mid \text{rain}) = \frac{1/2}{1/2 + 1/8} = \frac{4}{5}.
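To check the arithmetic, here is a small Python sketch with the table entries hard-coded as exact fractions:

```python
from fractions import Fraction

# Joint probabilities from the table above.
p_rain_fall = Fraction(1, 2)     # P(rain and fall)
p_rain_nofall = Fraction(1, 8)   # P(rain and not fall)

# Marginal probability of rain: sum the "Rain" row.
p_rain = p_rain_fall + p_rain_nofall        # 5/8

# P(fall | rain) = P(fall and rain) / P(rain).
p_fall_given_rain = p_rain_fall / p_rain
print(p_fall_given_rain)                    # 4/5
```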
Generating Functions
The Probability Generating Function of a non-negative integer-valued random variable X (i.e. X takes values in \{0, 1, 2, \dots\}) is defined to be
G(s) = \mathbb{E}[s^X] = \sum_k s^k \mathbb{P}(X=k) = \sum_k s^k f(k)
G(0) = \mathbb{P}(X=0) and G(1) = 1. Clearly G(s) converges for |s| \le 1.
Generating functions are very useful to study the sums of independent random variables and to calculate the moments of a distribution.
Examples.
Constant Variables
If \mathbb{P}(X=c)=1 then G(s) = \mathbb{E}[s^X] = s^c
Bernoulli variables
If \mathbb{P}(X=1)=p and \mathbb{P}(X=0)=1-p then G(s) = \mathbb{E}[s^X] = 1-p+ps
Binomial variable
If X is binomial with parameters n and p, and q = 1-p, then
G(s) = \mathbb{E}[s^X] = (q+ps)^n
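As a numerical sanity check of the binomial formula (with arbitrarily chosen n, p and s; these values are illustrative, not from the post), the defining sum \sum_k s^k \binom{n}{k} p^k q^{n-k} agrees with the closed form (q+ps)^n:

```python
import math

# Illustrative parameters (assumptions, not from the post).
n, p, s = 5, 0.3, 0.7
q = 1 - p

# PGF computed directly from the definition G(s) = sum_k s^k P(X = k).
g_direct = sum(s**k * math.comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

# Closed form G(s) = (q + p*s)^n.
g_closed = (q + p * s) ** n

assert math.isclose(g_direct, g_closed)
print(g_direct, g_closed)
```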
Poisson distribution
If X has a Poisson distribution with parameter \lambda then
\begin{align}
G(s) = \mathbb{E}(s^X) = \sum_{k=0}^\infty s^k \frac{\lambda^k}{k!}e^{-\lambda} = \sum_{k=0}^\infty \frac{(s\lambda)^k}{k!} e^{-\lambda} = e^{s\lambda}e^{-\lambda} = e^{\lambda(s - 1)}
\end{align}
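Again purely as a numerical sketch (λ and s chosen arbitrarily), a truncated version of the sum above agrees with e^{\lambda(s-1)}:

```python
import math

# Illustrative parameters (assumptions for the check).
lam, s = 2.5, 0.6

# Truncate the infinite sum; the terms decay like lam^k / k!.
g_direct = sum(s**k * lam**k / math.factorial(k) * math.exp(-lam) for k in range(60))

# Closed form e^{lambda (s - 1)}.
g_closed = math.exp(lam * (s - 1))

assert math.isclose(g_direct, g_closed)
print(g_direct, g_closed)
```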
The following result shows how generating functions can be used to calculate the moments of a distribution:
\frac{dG}{ds}\Big|_{s=1} = \mathbb{E}(X) and in general
\begin{align}
\frac{d^kG}{ds^k}\Big|_{s=1} = \mathbb{E}[X(X-1)\cdots(X-k+1)]
\end{align}
for k \ge 1.
To see this, take the kth derivative
\begin{align}
\frac{d^kG(s)}{ds^k} & = \sum_{n=k}^{\infty} n(n-1)\cdots(n-k+1)s^{n-k} \mathbb{P}(X=n) \\
&=\mathbb{E}[s^{X-k}X(X-1)\cdots(X-k+1)]
\end{align}
and evaluate at s=1.
Technical note: actually you need to let s \uparrow 1 and then apply Abel’s theorem.
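As an illustration, here is a small sympy sketch applying this to the Poisson PGF derived above: differentiating G and substituting s = 1 recovers \mathbb{E}[X] = \lambda and \mathbb{E}[X(X-1)] = \lambda^2.

```python
import sympy as sp

s, lam = sp.symbols('s lambda', positive=True)

# Poisson PGF derived above: G(s) = exp(lambda*(s - 1)).
G = sp.exp(lam * (s - 1))

# First derivative at s = 1 gives E[X].
EX = sp.diff(G, s).subs(s, 1)           # lambda

# Second derivative at s = 1 gives E[X(X-1)].
EXX1 = sp.diff(G, s, 2).subs(s, 1)      # lambda**2

# Hence Var(X) = E[X(X-1)] + E[X] - E[X]^2 = lambda.
var = sp.simplify(EXX1 + EX - EX**2)
print(EX, EXX1, var)
```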
Abel’s Theorem.
Let G(s) = \sum_{i=0}^{\infty} c_i s^i where c_i \ge 0 and G(s) < \infty for |s| < 1. Then \lim_{s\uparrow 1}G(s) = G(1).
A closely related generating function is called the moment generating function.
The Moment Generating Function of the non-negative integer-valued random variable X is defined to be
\begin{align} M(t) = \mathbb{E}[e^{tX}] = \sum_k e^{tk} \mathbb{P}(X=k) = \sum_k e^{tk} f(k) \end{align}
Note that M(t) = G(e^t)
As the name implies, using the moment generating function is an easier way to get the moments.
Taking the nth derivative of the moment generating function of X and putting t = 0 gives the nth moment of X, i.e. \frac{d^n M}{dt^n}\Big|_{t=0} = \mathbb{E}[X^n]
Proof.
\begin{align}
M(t) &= \sum_{k=0}^\infty e^{tk} \mathbb{P}(X=k) \\
& = \sum_{k=0}^\infty \left(\sum_{n=0}^\infty \frac{(tk)^n}{n!}\right) \mathbb{P}(X = k) \\
& = \sum_{n=0}^\infty \frac{t^n}{n!} \sum_{k=0}^\infty k^n \mathbb{P}(X=k)\\
& = \sum_{n=0}^\infty \frac{t^n}{n!} \mathbb{E}[X^n]\\
\end{align}
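Continuing the Poisson example, a short sympy sketch confirming that derivatives of M(t) = G(e^t) at t = 0 give the raw moments, \mathbb{E}[X] = \lambda and \mathbb{E}[X^2] = \lambda + \lambda^2:

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)

# Poisson MGF via M(t) = G(e^t) with G(s) = exp(lambda*(s - 1)).
M = sp.exp(lam * (sp.exp(t) - 1))

# nth derivative at t = 0 gives the nth raw moment E[X^n].
EX = sp.diff(M, t).subs(t, 0)                    # equals lambda
EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # equals lambda**2 + lambda

print(EX, EX2)
```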
Moment generating functions can be defined for more general random variables, not necessarily discrete or non-negative.
Permutations and Combinations
(Discrete) Probability is essentially about counting: counting the number of ways an outcome from a random experiment can happen. We need some tricks to help us. 😃
Permutations
A permutation of a set is an ordering of the set. If you have N elements in a set and you change the order you have another permutation.
Example 1.
Consider the set A=\{ 1,2,3 \} . We can re-arrange the items in a set to form the following permutations
\begin{array}{cc}
(1,2,3) & (1,3,2) \\
(2,1,3) & (2,3,1) \\
(3,1,2) & (3,2,1) \\
\end{array}
We have 3! = 3*2*1 = 6 permutations (see next example for why this is the case)
The symbol N! (factorial) means N(N-1)…1

Example 2.
Permutations of N different items.
If you have N different hats how many ways of ordering them are there? There are N ways to choose the first hat, N-1 ways to choose the second hat and so on.
We have in total N(N-1)… 1 = N! ways of ordering the hats.
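A quick Python sketch illustrating both examples: itertools enumerates the orderings, and the count matches N! for a few small N.

```python
import itertools
import math

# Example 1: the 3! = 6 permutations of {1, 2, 3}.
print(list(itertools.permutations([1, 2, 3])))

# Example 2: for N different hats there are N! orderings.
for N in range(1, 7):
    assert len(list(itertools.permutations(range(N)))) == math.factorial(N)
```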

Example 3.
Permutations with Repetition.
Suppose you have N hats, K black and N-K red. Apart from the colour they are indistinguishable. How many ways of ordering them are there? We know from above that there are N! orderings if the hats are all different. However, permuting the K black hats among themselves (K! ways) or the N-K red hats among themselves ((N-K)! ways) does not change the colour pattern, so each distinct ordering has been counted K!(N-K)! times. Hence there are
\frac{N!}{K!(N-K)!} orderings.
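A brute-force check of this formula for small, made-up values of N and K: collapsing the N! orderings with a set (same-coloured hats are indistinguishable) leaves exactly N!/(K!(N-K)!) colour patterns.

```python
import itertools
import math

# Illustrative sizes (assumptions): N = 6 hats, K = 2 black.
N, K = 6, 2
hats = ['B'] * K + ['R'] * (N - K)

# Distinct colour patterns: a set collapses orderings that look the same.
patterns = set(itertools.permutations(hats))

formula = math.factorial(N) // (math.factorial(K) * math.factorial(N - K))
print(len(patterns), formula)     # 15 15
assert len(patterns) == formula
```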
Combinations
A combination of a set, A , is a subset of A , and unlike permutations above the order does not matter. An example will make this clearer.
Example 4.
Let’s suppose A=\{ 1,2,3 \} . Then the following sets are all the combinations of A
\begin{array}{ccc}
\{1,2\} & \{2,3\} & \{3,1\} \\
\{1\} & \{2\} & \{3\}\\
\{1,2,3\} & \emptyset & \\
\end{array}
You will notice there are 8 of them: each element of A can either be chosen or not chosen for the combination (2 choices), and there are three elements in A altogether, so there are 2^3 = 8 combinations in total.
To be more specific, subsets of size K are called K-combinations. Note that the sets \{1,2\} and \{2,1\} are the same 2-combination (subset) as changing the order of the elements does not create a new combination.
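A short Python sketch enumerating every subset of A (i.e. every K-combination for K = 0, 1, 2, 3) and confirming the 2^3 = 8 count:

```python
import itertools

A = [1, 2, 3]

# All K-combinations for K = 0, 1, ..., len(A); together they form every subset of A.
subsets = [c for k in range(len(A) + 1) for c in itertools.combinations(A, k)]

print(subsets)       # (), (1,), (2,), (3,), (1, 2), (1, 3), (2, 3), (1, 2, 3)
print(len(subsets))  # 8 == 2**3
assert len(subsets) == 2 ** len(A)
```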
Let’s concentrate on creating a K-combination of a set of N (distinct) elements. Note we don’t actually have to bother with the word distinct, because by definition a mathematical set contains distinct elements.
Example 5.
K-combinations of an N-set
Suppose A=\{ 1,2,3,4 \}. Write down all the possible 3-combinations.
\begin{array}{cccc}
\{1,2,3\}&\{2,3,4\}&\{3,4,1\}&\{4,1,2\}
\end{array}
If we wanted to create all the permutations of subsets of A of size 3 we would have 4 choices for the first item, 3 choices for the second item and 2 choices for the third item, i.e. 4*3*2 permutations in total. Each one of those sets above has 3! permutations, as in Example 1, since it contains 3 elements, so the number of combinations is \frac{4*3*2}{3!} or \frac{4!}{(4-3)!3!}. In general the number of K-combinations from a set of size N is \frac{N!}{(N-K)!K!}
\frac{N!}{(N-K)!K!} is such an important formula it has its own notation:
C(N,K) = \frac{N!}{(N-K)!K!}
C(N,K) is sometimes also written as \binom{N}{K} and pronounced “N choose K”.
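Finally, a small sketch listing the 3-combinations of A = \{1,2,3,4\} and checking the count against the formula; Python's math.comb computes \binom{N}{K} directly.

```python
import itertools
import math

A = [1, 2, 3, 4]
K = 3

# All K-combinations (subsets of size K, order irrelevant).
combos = list(itertools.combinations(A, K))
print(combos)     # (1, 2, 3), (1, 2, 4), (1, 3, 4), (2, 3, 4)

# N! / ((N-K)! K!) == "N choose K".
N = len(A)
formula = math.factorial(N) // (math.factorial(N - K) * math.factorial(K))
assert len(combos) == formula == math.comb(N, K) == 4
```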