Notes on Covariance
It's helpful for me to keep in mind that variance and covariance are just expectations — weighted averages that help describe a distribution.
The definition first, which I've found appears naturally when you expand $Var(X+Y) = E[(X+Y - \mu_X - \mu_Y)^2]$:

$$Cov(X,Y) = E[(X - E[X])(Y - E[Y])]$$

Expanding the RHS, we see that:

$$Cov(X,Y) = E[XY] - E[X]E[Y]$$

Note: if $X$ and $Y$ are independent, then $E[g(X)h(Y)] = E[g(X)]E[h(Y)]$, and so $Cov(X,Y) = 0$.
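As a quick numerical sanity check (a NumPy sketch, not part of the original notes), the two forms of covariance agree on any sample when both use the same means:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x + rng.normal(size=100_000)  # y is correlated with x

# Cov(X, Y) = E[(X - E[X])(Y - E[Y])], estimated from the sample
lhs = np.mean((x - x.mean()) * (y - y.mean()))

# Cov(X, Y) = E[XY] - E[X]E[Y], same sample
rhs = np.mean(x * y) - x.mean() * y.mean()

print(lhs, rhs)  # the two estimates agree up to floating point
```

The agreement here is exact algebra, not sampling luck: expanding the product inside the first mean reproduces the second expression term by term.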
Independence $\Rightarrow Cov(X,Y) = 0$ always; however, the converse is not true!
$Cov(X,Y) = 0$ does not imply independence! For example, let $X$ be a RV such that $P(X=0) = P(X=1) = P(X=-1) = \frac{1}{3}$, and let $Y = 0$ if $X \neq 0$ and $Y = 1$ if $X = 0$. Clearly $X$ and $Y$ are dependent, yet $XY = 0$ always, so $E[XY] = E[X] = 0$ and hence $Cov(X,Y) = 0$.
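Simulating that counterexample makes it concrete (again a NumPy sketch added for illustration): $Y$ is a deterministic function of $X$, yet the sample covariance sits at zero.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.choice([-1, 0, 1], size=n)  # P(X=-1) = P(X=0) = P(X=1) = 1/3
y = (x == 0).astype(float)          # Y = 1 iff X = 0, so Y depends on X

# x*y is identically zero (y is nonzero only where x is zero),
# so the sample covariance is essentially 0 despite total dependence.
cov = np.mean(x * y) - x.mean() * y.mean()
print(cov)
```

The only deviation from exactly zero comes from `x.mean()` not being exactly zero in a finite sample.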
Properties
$$Cov(X,Y) = Cov(Y,X)$$

$$Cov(X,X) = Var(X)$$

$$Cov(aX+j,\, bY+k) = ab\,Cov(X,Y)$$

$$Cov(aX+bY,\, cW+dV) = ac\,Cov(X,W) + ad\,Cov(X,V) + bc\,Cov(Y,W) + bd\,Cov(Y,V)$$

$$Cov\bigg(\sum_{i=1}^{n}X_i,\sum_{j=1}^{m}Y_j\bigg) = \sum_{i=1}^{n}\sum_{j=1}^{m}Cov(X_i,Y_j)$$

For example,

$$Cov(X_1 + X_2,\, Y_1 + Y_2 + Y_3) = Cov(X_1,Y_1) + Cov(X_1,Y_2) + Cov(X_1,Y_3) + Cov(X_2,Y_1) + Cov(X_2,Y_2) + Cov(X_2,Y_3)$$

$$Var\bigg(\sum_{i=1}^{n}X_i\bigg) = \sum_{i=1}^{n} Var(X_i) + 2 \sum_{i < j} Cov(X_i,X_j)$$

Also, since $Var(X_1+X_2) = Cov(X_1+X_2,\, X_1+X_2)$,

$$Var\bigg(\sum_{i=1}^{n}X_i\bigg) = Cov\bigg(\sum_{i=1}^{n}X_i,\sum_{j=1}^{n}X_j\bigg) = \sum_{i=1}^{n}\sum_{j=1}^{n}Cov(X_i,X_j)$$

For example,

$$Var(X+Y+Z) = Var(X) + Var(Y) + Var(Z) + 2\big(Cov(X,Y) + Cov(X,Z) + Cov(Y,Z)\big)$$
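The variance-of-a-sum identity can also be checked numerically. A minimal NumPy sketch (added for illustration; note `bias=True` so `np.cov` uses the same $1/n$ normalization as `np.var`):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
# Build three correlated samples from shared sources of noise
a, b, c = rng.normal(size=(3, n))
x, y, z = a, a + b, b - c

# Var(X+Y+Z) = Var(X) + Var(Y) + Var(Z)
#              + 2*(Cov(X,Y) + Cov(X,Z) + Cov(Y,Z))
lhs = np.var(x + y + z)
rhs = (np.var(x) + np.var(y) + np.var(z)
       + 2 * (np.cov(x, y, bias=True)[0, 1]
              + np.cov(x, z, bias=True)[0, 1]
              + np.cov(y, z, bias=True)[0, 1]))
print(lhs, rhs)  # identical up to floating point
```

As with the first identity, this holds exactly for the sample moments themselves, since the sample versions satisfy the same algebra as the population versions.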