Notes on Independence
My notes on Independence.
✔ Two events $E$ and $F$ are independent if $P(E|F) = P(E)$. You would say "$E$ and $F$ are independent."
✔ Independence is symmetric: if $E$ is independent of $F$, then $F$ is independent of $E$. Also, if $E$ and $F$ are independent, then $E$ and $F^{c}$ are independent, since $P(EF^{c}) = P(E) - P(EF) = P(E)(1 - P(F)) = P(E)P(F^{c})$
✔ Independence of EVENTS (not RVs) $\Leftrightarrow P(EF) = P(E)P(F)$
✔ Independence of RVs $\Leftrightarrow P(X \in A, Y \in B) = P(X\in A)P(Y \in B)$ for ANY sets of real numbers $A$ and $B$
✔ To prove dependence you just need one counterexample; to prove independence you need to check every possible case.
For example, you flip a fair coin twice. Let $X$ be the outcome of the first flip, where $X = 1$ for heads and $X = 0$ for tails. Let $Z$ be the total number of heads on two flips.
If $Z = 0$ then you already know that $X = 0$. Therefore $X$ and $Z$ are not independent, and it only took one case to show this.
Note that $X$ and $Z-X$ are independent! For example, if $Z-X = 0$, then $X$ is $0$ or $1$ with equal probability (HT: $1-1=0$, or TT: $0-0=0$). If $Z-X = 1$, $X$ is again $0$ or $1$ with equal probability (HH: $2-1=1$, or TH: $1-0=1$).
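A quick way to check both claims is to enumerate the four equally likely outcomes. Here is a minimal Python sketch (not part of the original notes, just a sanity check by counting):

```python
from itertools import product

# The four equally likely outcomes of two fair coin flips (1 = heads, 0 = tails).
outcomes = list(product([0, 1], repeat=2))  # (first flip, second flip)

def prob(event):
    """Probability of an event under the uniform distribution on the 4 outcomes."""
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

def cond_prob(event, given):
    """P(event | given) by counting."""
    return prob(lambda o: event(o) and given(o)) / prob(given)

X = lambda o: o[0]           # outcome of the first flip
Z = lambda o: o[0] + o[1]    # total number of heads

# X and Z are dependent: P(X = 0 | Z = 0) = 1, but P(X = 0) = 1/2.
print(cond_prob(lambda o: X(o) == 0, lambda o: Z(o) == 0))   # 1.0
print(prob(lambda o: X(o) == 0))                             # 0.5

# X and Z - X are independent: conditioning on Z - X leaves P(X = 1) at 1/2.
print(cond_prob(lambda o: X(o) == 1, lambda o: Z(o) - X(o) == 0))  # 0.5
print(cond_prob(lambda o: X(o) == 1, lambda o: Z(o) - X(o) == 1))  # 0.5
```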
✔ Equivalently, in terms of joint CDFs: independence $\Leftrightarrow F_{X,Y}(a,b) = F_X(a)F_Y(b)\quad \forall a,b$ (take $A = (-\infty, a]$ and $B = (-\infty, b]$ in the criterion above for the forward direction)
✔ When $X$ and $Y$ are discrete RVs, Independence is equivalent to $p_{X,Y}(x,y) = p_X(x)p_Y(y) \quad \forall x,y$
✔ When $X$ and $Y$ are jointly continuous, then independence is equivalent to $f_{X,Y}(x,y) = f_X(x)f_Y(y) \quad \forall x,y$
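The pmf criterion can be checked mechanically by looping over every $(x, y)$ pair. A small sketch, using two fair dice as a hypothetical example (not from these notes): the independent pair $(X, Y)$ factorizes, while the dependent pair $(X, X+Y)$ does not.

```python
from itertools import product
from fractions import Fraction

# Two independent fair dice; each of the 36 outcomes has probability 1/36.
sample_space = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(sample_space))

def joint_pmf(f, g):
    """Joint pmf of the pair (f(omega), g(omega)) as a dict {(x, y): prob}."""
    pmf = {}
    for omega in sample_space:
        key = (f(omega), g(omega))
        pmf[key] = pmf.get(key, 0) + p
    return pmf

def marginal(pmf, index):
    m = {}
    for key, pr in pmf.items():
        m[key[index]] = m.get(key[index], 0) + pr
    return m

def factorizes(pmf):
    """True iff p(x, y) = p_X(x) * p_Y(y) for every x, y in the marginal supports."""
    px, py = marginal(pmf, 0), marginal(pmf, 1)
    return all(pmf.get((x, y), 0) == px[x] * py[y] for x in px for y in py)

X = lambda w: w[0]         # first die
Y = lambda w: w[1]         # second die
S = lambda w: w[0] + w[1]  # their sum

print(factorizes(joint_pmf(X, Y)))  # True  -> X and Y are independent
print(factorizes(joint_pmf(X, S)))  # False -> X and X + Y are dependent
```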
✔ Cond'l independence: $P(E|FG) = P(E|G)$, and you say "$E$ and $F$ are independent given $G$"
✔ Cond'l independence is equivalent to $P(EF|G) = P(E|G)P(F|G)$, and the proof is as follows:
$$P(E|FG) = P(E|G)$$
$$ \Rightarrow \frac{P(EFG)}{P(FG)} = \frac{P(EG)}{P(G)} $$
And by the Multiplication Rule, $P(FG) = P(F|G)P(G)$
$$ \Rightarrow \frac{P(EFG)}{P(F|G)P(G)} = \frac{P(EG)}{P(G)} $$
$$ \Rightarrow \frac{P(EFG)}{P(G)} = \frac{P(EG)}{P(G)}P(F|G) $$
$$ \Rightarrow P(EF|G) = P(E|G)P(F|G)$$
✔ Independence does not always imply cond'l independence, and cond'l independence does not always imply independence.
✔ Independence does not imply cond'l independence. Flip 2 coins. Let $E$ = heads on first, $F$ = heads on second, $G$ = the two coins land the same. Then $E, F, G$ are pairwise independent, but $P(E|FG) \neq P(E|F)$: in fact $P(E|FG) = 1$ while $P(E|F) = 1/2$, so $E$ and $G$ are not independent given $F$.
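This example is small enough to verify by enumeration. A minimal sketch of that check (my own, not from the notes):

```python
from itertools import product

outcomes = list(product(['H', 'T'], repeat=2))   # 4 equally likely outcomes

def prob(event):
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

E = lambda o: o[0] == 'H'    # heads on the first flip
F = lambda o: o[1] == 'H'    # heads on the second flip
G = lambda o: o[0] == o[1]   # both coins land the same

# Pairwise independence: P(AB) = P(A)P(B) for each pair.
for A, B in [(E, F), (E, G), (F, G)]:
    both = prob(lambda o: A(o) and B(o))
    print(both == prob(A) * prob(B))             # True, True, True

# But P(E | FG) = 1 while P(E | F) = 1/2, so E and G are not independent given F.
p_E_given_FG = prob(lambda o: E(o) and F(o) and G(o)) / prob(lambda o: F(o) and G(o))
p_E_given_F = prob(lambda o: E(o) and F(o)) / prob(F)
print(p_E_given_FG, p_E_given_F)                 # 1.0 0.5
```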
✔ Cond'l independence does not imply independence. "$A$ and $A$ are conditionally independent given $A$," since $P(A|AA) = P(A|A)$. But $A$ is not independent of itself: $P(A|A) = 1$, which does not equal $P(A)$ unless $P(A) = 1$ (e.g., $A$ is the entire sample space).
✔ Another example. $D$ is uniform continuous on $(0,1)$ and, given $D$, the RVs $A, B, C$ are independent draws from the uniform distribution on $(0,D)$. Knowing $A$ does not help you predict $B$ once you know $D$, so $A, B, C$ are conditionally independent given $D$. But $A, B, C$ are not independent of each other, for if you know $B$, that tells you information about $A$. Say $B = 0.9$; then $D > 0.9$, which changes the potential range of $A$. -- Revisit after you understand how to do conditioning not just on events, but on RVs
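A rough simulation of this setup (a sketch with an arbitrary sample size and slice width, not part of the original notes): marginally, $A$ and $B$ are positively correlated through $D$, but within a narrow slice of $D$ the correlation is roughly zero.

```python
import random

random.seed(0)
N = 200_000

D = [random.random() for _ in range(N)]   # D ~ Uniform(0, 1)
A = [d * random.random() for d in D]      # A | D ~ Uniform(0, D)
B = [d * random.random() for d in D]      # B | D ~ Uniform(0, D), independent of A given D

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return cov / (vx * vy) ** 0.5

# Marginally, A and B are dependent (positively correlated through D).
print(round(corr(A, B), 2))   # roughly 0.43 (the exact correlation works out to 3/7)

# Conditioning on D (here: keeping only a narrow slice of D) removes the dependence.
idx = [i for i in range(N) if 0.69 < D[i] < 0.71]
print(round(corr([A[i] for i in idx], [B[i] for i in idx]), 2))   # roughly 0
```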
✔ Interesting idea from stackexchange: "Functional dependence does not imply stochastic dependence"
For example, let $X =$ head or tail on the first flip and $Y =$ head or tail on the second flip, so $X, Y \sim \text{Bernoulli}(p=0.5)$.
Let $Z =$ total number of heads on both flips. Then $X$ and $Z$ are dependent (knowing $Z = 0$ means $X = 0$), and yet $X$ and $Z-X$ are independent.
Note that $Z = X+Y$, so $Z-X = Y$, and clearly $X$ and $Y$ are independent. Write out all the cases of $X$ and $Z-X$: $Z-X$ can be either $0$ or $1$, but knowing its value does not tell you the distribution of $X$ alone. Whether $Z-X = 0$ or $Z-X = 1$, $X$ is $0$ or $1$ with equal probability, so you have gained no information about $X$ by itself, even though to calculate $Z-X$ you clearly need to know $X$. So they are independent.
Draw out the table. For me, one point of confusion is remembering what $Z-X$ can equal: $Z$ can equal $2$, but $Z-X$ is either $0$ or $1$.
$$\begin{array}{c|c|c|c}
\hline
\textbf{First flip} & \textbf{Second flip} & \textbf{Total heads} & \\
X & Y & Z & Z-X \\
\hline
0 & 0 & 0 & ? \\
1 & 0 & 1 & ? \\
0 & 1 & 1 & ? \\
1 & 1 & 2 & ? \\
\end{array}$$
Now what can $Z-X$ be? If $Z-X$ is either $0$ or $1$, does it tell you the distribution of $X$ alone?
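One way to answer the question is to fill in the last column programmatically and check the factorization of the joint pmf of $X$ and $Z-X$. A minimal sketch (my own check, following the table above):

```python
from itertools import product
from collections import Counter

outcomes = list(product([0, 1], repeat=2))   # (X, Y): the two flips, each outcome has prob 1/4

# Fill in the last column of the table: Z = X + Y, so Z - X is just Y, i.e. 0 or 1.
print(" X  Y  Z  Z-X")
for x, y in outcomes:
    z = x + y
    print(f" {x}  {y}  {z}   {z - x}")

# Check the pmf factorization p(x, w) = p_X(x) * p_W(w) for W = Z - X.
n = len(outcomes)
joint = Counter((x, (x + y) - x) for x, y in outcomes)   # counts of (X, Z - X)
pX = Counter(x for x, _ in outcomes)
pW = Counter((x + y) - x for x, y in outcomes)
ok = all(joint[(x, w)] / n == (pX[x] / n) * (pW[w] / n) for x in pX for w in pW)
print(ok)   # True: X and Z - X are independent
```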