Notes on Conditioning
So many things spring from the basic definition of conditional probability.
Conditioning of events
$$P(A|B) + P(A^c|B) = 1$$
$$P(E|H) = P(E|FH)P(F|H) + P(E|F^cH)P(F^c|H)$$
Proof:
$$P(E|H) = \frac{P(EH)}{P(H)} = \frac{P(EHF \cup EHF^c)}{P(H)} = \frac{P(EFH)+P(EF^cH)}{P(H)} = \frac{P(E|FH)P(FH)+P(E|F^cH)P(F^cH)}{P(H)}$$
The last expression equals $P(E|FH)P(F|H) + P(E|F^cH)P(F^c|H)$ since $P(FH)/P(H) = P(F|H)$.
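A quick way to sanity-check the identity is simulation. A minimal sketch, with a made-up dependence structure among $E$, $F$, $H$ chosen purely for illustration:

```python
import random

random.seed(0)

# Made-up experiment: H, F, E occur with some dependence (the numbers are
# arbitrary, chosen only so the three events are non-trivially related).
def trial():
    h = random.random() < 0.5                        # P(H) = 0.5
    f = random.random() < (0.7 if h else 0.3)        # F depends on H
    e = random.random() < (0.9 if f and h else 0.2)  # E depends on F and H
    return e, f, h

trials = [trial() for _ in range(200_000)]

def p(event):
    return sum(1 for t in trials if event(*t)) / len(trials)

# Left side: P(E|H) = P(EH) / P(H)
lhs = p(lambda e, f, h: e and h) / p(lambda e, f, h: h)

# Right side: P(E|FH)P(F|H) + P(E|F^cH)P(F^c|H)
p_e_given_fh  = p(lambda e, f, h: e and f and h) / p(lambda e, f, h: f and h)
p_e_given_fch = p(lambda e, f, h: e and not f and h) / p(lambda e, f, h: not f and h)
p_f_given_h   = p(lambda e, f, h: f and h) / p(lambda e, f, h: h)
rhs = p_e_given_fh * p_f_given_h + p_e_given_fch * (1 - p_f_given_h)

print(lhs, rhs)  # the two should agree up to sampling noise
```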
Bayes
Generally speaking, the probability of hypothesis $i$ given new evidence:
$$P(H_i|E) = \frac{P(H_iE)}{P(E)}= \frac{P(E|H_i)P(H_i)}{P(E)} = \frac{P(E|H_i)P(H_i)}{P(E|H_i)P(H_i)+P(E|H_i^c)P(H_i^c)} = \frac{P(E|H_i)P(H_i)}{\sum_{j} P(E|H_j)P(H_j)}$$
where the different hypotheses $H_j$ are mutually exclusive AND exhaustive.
$$P(E) = P(EH_i \cup EH_i^c) = P(EH_i) + P(EH_i^c) = P(E|H_i)P(H_i)+P(E|H_i^c)P(H_i^c)$$
Note in the above that $H_i$ and $H_i^c$ are two hypotheses (for a specific $i$, like hypothesis 74: either 74 happened or it didn't) that are mutually exclusive and exhaustive, since $H_i \cup H_i^c = S$ by definition of complement.
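To see the machinery on concrete numbers, here's a minimal sketch with invented values (a test for a rare condition; the 1%, 95%, and 5% figures are assumptions for illustration only):

```python
# Hypothetical numbers: H = "person has a rare condition", E = "test is positive".
p_h          = 0.01   # prior P(H)
p_e_given_h  = 0.95   # P(E|H): true positive rate
p_e_given_hc = 0.05   # P(E|H^c): false positive rate

# Denominator via the expansion above: P(E) = P(E|H)P(H) + P(E|H^c)P(H^c)
p_e = p_e_given_h * p_h + p_e_given_hc * (1 - p_h)

# Bayes: P(H|E) = P(E|H)P(H) / P(E)
p_h_given_e = p_e_given_h * p_h / p_e
print(round(p_h_given_e, 3))  # 0.161 -- still small, the tiny prior dominates
```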
Odds and probability can be confusing; you read a portion of your first stats article on this. So the odds of rolling a six are 1/5 to 1, meaning you are 1/5th as likely to roll a six as not. Kind of confusing. Rolling a "not 6" has probability 5/6 and rolling a 6 has probability 1/6, so you are 1/5 as likely to roll a 6. Put another way, you are 5 times more likely to roll a "not 6" than a 6.
$$\text{Odds}(A) = \frac{P(A)}{P(A^{c})} \qquad \text{odds of rolling a six} = \frac{1/6}{5/6} = \frac{1}{5}$$
Or 5 to 1 odds that you don't roll a six, if you define event $A$ as not rolling a 6.
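As a one-liner, converting probability to odds (a minimal sketch):

```python
def odds(p):
    """Odds in favor of an event with probability p: P(A) / P(A^c)."""
    return p / (1 - p)

print(odds(1 / 6))  # 0.2 -> 1/5 to 1 in favor of rolling a six
print(odds(5 / 6))  # 5.0 -> 5 to 1 in favor of NOT rolling a six
```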
Note from definitions,
$$\frac{P(H|E)}{P(H^{c}|E)} = \frac{P(H)}{P(H^{c})}\frac{P(E|H)}{P(E|H^{c})}$$
(write $P(H|E) = P(E|H)P(H)/P(E)$ and likewise for $H^c$; the $P(E)$'s cancel). This says that the new odds of $H$ given a new piece of evidence are the old odds times the likelihood ratio $\frac{P(E|H)}{P(E|H^{c})}$. Therefore $H$ becomes more likely with new $E$ if the evidence was more likely to happen given the hypothesis than given its negation.
For example, a piece of evidence in a court case is introduced. It supports the hypothesis that the dude is guilty af. Jury is like "kill this man." But suppose that, had the hypothesis been false, the evidence would have been even *more* likely. If he had been innocent, the evidence would have been even more likely to have happened! Therefore, it actually waters down the original estimation of his guilt.
Let's say a man is on trial for theft at the San Diego UTC mall. New evidence says he was on planet Earth at the time of the theft. This seems to support the hypothesis that he is guilty. However, the evidence (he was on planet Earth) was equally likely whether he is innocent or guilty, so the likelihood ratio is 1 and the odds of guilt don't move.
The hypothesis becomes more likely with new evidence, $P(H|E) > P(H)$, if and only if $P(E|H) > P(E|H^{c})$.
Prior probability of the hypothesis $= P(H)$
Posterior (or after-evidence) probability $= P(H|E)$
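The odds form makes the update mechanical. A minimal sketch (the prior odds and likelihoods below are made-up numbers for illustration):

```python
def posterior_odds(prior_odds, p_e_given_h, p_e_given_hc):
    # New odds = old odds * likelihood ratio P(E|H) / P(E|H^c)
    return prior_odds * p_e_given_h / p_e_given_hc

prior = 0.25  # hypothetical prior odds of guilt: 1 to 4

print(posterior_odds(prior, 0.8, 0.2))  # 1.0   -- evidence 4x likelier under guilt: odds rise
print(posterior_odds(prior, 1.0, 1.0))  # 0.25  -- "he was on planet Earth": LR = 1, unchanged
print(posterior_odds(prior, 0.3, 0.6))  # 0.125 -- evidence likelier under innocence: odds fall
```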
* Let $G$ be the event a person is guilty. Is it possible that two pieces of evidence separately increase the likelihood of $G$ but together decrease it? That is, $P(G|E_1) > P(G)$ and $P(G|E_2) > P(G)$, yet $P(G|E_1E_2) < P(G)$? Yes, it is possible. Say we know the crime went down between 1 and 3pm. Let $E_1$ be the event our person was at a nearby coffee shop between 1 and 2pm and $E_2$ the event he was by the coffee shop between 2 and 3pm. Each sighting alone places him near the scene, which is more likely if he's guilty; but if both $E_1$ and $E_2$ occur, he has an alibi for the whole window.
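Here is a toy joint distribution with numbers invented to match that story (a guilty person was seen near the shop exactly one of the two hours; an innocent person was mostly either there the whole time or not at all), checked by direct enumeration:

```python
from itertools import product

p_g = 0.5  # prior P(G), assumed for illustration

# P(E1, E2 | G): the guilty person was near the shop exactly one hour
# (never both -- the other hour he was committing the crime).
p_e_given_g  = {(1, 0): 0.5, (0, 1): 0.5, (1, 1): 0.0, (0, 0): 0.0}
# P(E1, E2 | G^c): an innocent person was either there the whole time or not at all (mostly).
p_e_given_gc = {(1, 0): 0.1, (0, 1): 0.1, (1, 1): 0.2, (0, 0): 0.6}

# Build the joint distribution over (G, E1, E2).
joint = {}
for e1, e2 in product([0, 1], repeat=2):
    joint[(1, e1, e2)] = p_g * p_e_given_g[(e1, e2)]
    joint[(0, e1, e2)] = (1 - p_g) * p_e_given_gc[(e1, e2)]

def p(event):
    return sum(q for (g, e1, e2), q in joint.items() if event(g, e1, e2))

def p_g_given(cond):
    return p(lambda g, e1, e2: g == 1 and cond(e1, e2)) / p(lambda g, e1, e2: cond(e1, e2))

print(p_g_given(lambda e1, e2: True))                # P(G)      = 0.5
print(p_g_given(lambda e1, e2: e1 == 1))             # P(G|E1)   = 0.625 (up)
print(p_g_given(lambda e1, e2: e2 == 1))             # P(G|E2)   = 0.625 (up)
print(p_g_given(lambda e1, e2: e1 == 1 and e2 == 1)) # P(G|E1E2) = 0.0   (down: alibi)
```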