Joint Distribution of Functions of RVs
Let's say $U_1$ and $U_2$ are independent uniform RVs. By applying functions to them, we can transform them into standard normal RVs. Very cool.
Requirements
If:
- $X_1$ and $X_2$ have a joint pdf $f_{X_1,X_2}(x_1,x_2)$
- Given $Y_1 = g_1(X_1,X_2)$ and $Y_2 = g_2(X_1,X_2)$, the equations $y_1 = g_1(x_1,x_2)$ and $y_2 = g_2(x_1,x_2)$ can be uniquely solved for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, with solutions $x_1 = h_1(y_1,y_2)$ and $x_2 = h_2(y_1,y_2)$
- The Jacobian determinant $J(x_1,x_2) = \frac{\partial g_1}{\partial x_1}\frac{\partial g_2}{\partial x_2} - \frac{\partial g_1}{\partial x_2}\frac{\partial g_2}{\partial x_1} \neq 0$ at all points $(x_1,x_2)$
Then the joint density of $Y_1$ and $Y_2$ is the following.
$$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(h_1(y_1,y_2),h_2(y_1,y_2))|J(h_1(y_1,y_2),h_2(y_1,y_2))|^{-1}$$
Example
Two standard normal RVs as Cartesian coordinates $(X_1,X_2)$ into polar coordinates $(R,\Theta)$
We can use the functions
$$Y_1 = \sqrt{X_1^2+X_2^2} = R$$ $$Y_2 = \arctan\bigg(\frac{X_2}{X_1}\bigg) = \theta$$Step 1 is to solve for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, which we can do using trigonometry.
$$X_1 = R\cos(\theta)$$ $$X_2 = R\sin(\theta)$$Step 2 is to compute the Jacobian.
$$\frac{\partial g_1}{\partial x_1} = 2x_1 \frac{1}{2} (x_1^2+x_2^2)^{-\frac{1}{2}} = \frac{x_1}{(x_1^2+x_2^2)^{\frac{1}{2}}}$$ $$\frac{\partial g_1}{\partial x_2} = 2x_2 \frac{1}{2} (x_1^2+x_2^2)^{-\frac{1}{2}} = \frac{x_2}{(x_1^2+x_2^2)^{\frac{1}{2}}}$$ $$\frac{\partial g_2}{\partial x_1} = \frac{-x_2}{x_1^2} \frac{1}{1+(\frac{x_2}{x_1})^2} = \frac{-x_2}{x_1^2 + x_2^2}$$ $$\frac{\partial g_2}{\partial x_2} = \frac{1}{x_1} \frac{1}{1+(\frac{x_2}{x_1})^2} = \frac{1}{x_1 + \frac{x_2^2}{x_1}} = \frac{x_1}{x_1^2 + x_2^2}$$Therefore the Jacobian is
$$J(x_1,x_2) = \frac{x_1}{(x_1^2+x_2^2)^{\frac{1}{2}}}\frac{x_1}{x_1^2 + x_2^2} - \frac{x_2}{(x_1^2+x_2^2)^{\frac{1}{2}}} \frac{-x_2}{x_1^2 + x_2^2}$$ $$= \frac{x_1^2}{(x_1^2 + x_2^2)^\frac{3}{2}} + \frac{x_2^2}{(x_1^2 + x_2^2)^\frac{3}{2}} = \frac{x_1^2+x_2^2}{(x_1^2 + x_2^2)^\frac{3}{2}} = \frac{1}{\sqrt{x_1^2+x_2^2}} = \frac{1}{R}$$And therefore
$$|J(x_1,x_2)|^{-1} = R$$
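As a sanity check, we can reproduce this Jacobian symbolically. Here's a minimal sketch using sympy (the library choice and variable names are my own, not part of the original notes):

```python
import sympy as sp

# Treat x1 as positive so arctan(x2/x1) picks the right angle branch.
x1, x2 = sp.symbols('x1 x2', positive=True)

# The forward transformation (g1, g2) from Cartesian to polar.
g1 = sp.sqrt(x1**2 + x2**2)   # R
g2 = sp.atan(x2 / x1)         # theta

# Jacobian determinant of (g1, g2) with respect to (x1, x2).
J = sp.Matrix([g1, g2]).jacobian([x1, x2]).det()

print(sp.simplify(J))  # expect 1/sqrt(x1**2 + x2**2), i.e. 1/R
```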
Plugging into the transformation formula,
$$f_{Y_1,Y_2}(r,\theta) = r f_{X_1,X_2}(r\cos(\theta),r\sin(\theta)) $$ $$= r \frac{1}{\sqrt{2\pi}} \frac{1}{\sqrt{2\pi}} \exp \bigg\{ -\frac{1}{2} (r\cos(\theta))^2 \bigg\} \exp \bigg\{ -\frac{1}{2} (r\sin(\theta))^2 \bigg\}$$ $$=\frac{r}{2\pi}\exp\bigg\{ -\frac{1}{2}(r^2\cos^2(\theta)+r^2\sin^2(\theta)) \bigg\} = \frac{r}{2\pi}e^{-\frac{1}{2}r^2}$$for $\theta \in (0,2\pi)$ and $0 < r < \infty$.
Note that $f_\Theta(\theta) = \frac{1}{2\pi}$ for $\theta \in (0,2\pi)$: the angle is uniformly distributed around the circle.
And therefore $f_R(r) = re^{-r^2/2}$, which is the pdf of a Rayleigh distribution with scale parameter $\sigma = 1$. Since the joint density factors into the two marginals, $R$ and $\Theta$ are independent.
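We can sanity-check this by simulation. A quick sketch with numpy (my own illustration): draw standard normal pairs, form $R$ and $\Theta$, and compare their sample means to the Rayleigh mean $\sqrt{\pi/2} \approx 1.2533$ and the Uniform$(0,2\pi)$ mean $\pi$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent standard normals as Cartesian coordinates.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)

# Polar coordinates; wrap the angle into (0, 2*pi) to match the text.
r = np.sqrt(x1**2 + x2**2)
theta = np.mod(np.arctan2(x2, x1), 2 * np.pi)

# Rayleigh(sigma=1) has mean sqrt(pi/2); Uniform(0, 2*pi) has mean pi.
print(r.mean(), np.sqrt(np.pi / 2))   # should be close
print(theta.mean(), np.pi)            # should be close
```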
Example: Two uniforms into standard normals
Let's start off by showing that you can get an Exponential RV from a Uniform RV (this works for any rate parameter, but we'll use $\lambda = \frac{1}{2}$). The key observation is that $e^{-\frac{x}{2}}$ lies strictly between 0 and 1 for positive $x$, so it can be matched against a Uniform$(0,1)$ RV $U$. Let $X \sim Exp(\lambda = \frac{1}{2})$; then
$$P(X \le x) = 1 - e^{-\frac{x}{2}} = P(U > e^{-\frac{x}{2}}) = P\bigg(\ln(U) > -\frac{x}{2}\bigg) = P(-2\ln(U) \le x) $$And therefore $-2\ln(U) \sim Exp(\lambda = \frac{1}{2})$, so we can take $X = -2\ln(U)$.
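In code this is just inverse-transform sampling. A minimal sketch with numpy (the check against the theoretical moments is my own addition):

```python
import numpy as np

rng = np.random.default_rng(0)

# U ~ Uniform(0, 1), so X = -2*ln(U) ~ Exp(lambda = 1/2).
u = rng.uniform(size=1_000_000)
x = -2 * np.log(u)

# Exp(lambda = 1/2) has mean 1/lambda = 2 and variance 1/lambda^2 = 4.
print(x.mean())  # ~2
print(x.var())   # ~4
```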
Tuck that bad boy away. Now let's revisit the Cartesian-to-polar transformation for standard normal RVs $(X_1,X_2)$, but this time using $R^2$ instead of $R$, because $R^2$ has an Exponential marginal density rather than a Rayleigh, and we just showed how to get an Exponential from a Uniform. (The intuition behind this move comes from the chi-squared distribution, as we'll see below.)
$$Y_1 = X_1^2+X_2^2 = R^2$$ $$Y_2 = \arctan\bigg(\frac{X_2}{X_1}\bigg) = \theta$$ Running through the same Jacobian machinery, for $0 < r^2 < \infty$ and $\theta \in (0,2\pi)$ $$f_{R^2,\Theta}(r^2,\theta) = \frac{1}{2\pi} \frac{1}{2} e^{-r^2/2}$$Notice that the marginal density of $\Theta$ is uniform and the density of $R^2$ is exponential with parameter $\lambda = \frac{1}{2}$. Side note: remember that an exponential with $\lambda = \frac{1}{2}$ is the same as a chi-squared distribution with 2 degrees of freedom. We can double-check this, because $R^2 = X_1^2 + X_2^2$ where $X_1$ and $X_2$ are standard normal RVs, which means $R^2$ is chi-squared with 2 degrees of freedom.
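A quick simulation makes the chi-squared connection concrete (a sketch of my own; all three quantities share the same distribution, with mean 2):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# R^2 built from two independent standard normals.
r_squared = rng.standard_normal(n)**2 + rng.standard_normal(n)**2

# Exp(lambda = 1/2) and chi-squared with 2 df are the same distribution.
exp_half = rng.exponential(scale=2, size=n)   # scale = 1/lambda
chi2_2 = rng.chisquare(df=2, size=n)

print(r_squared.mean(), exp_half.mean(), chi2_2.mean())  # all ~2
```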
Therefore $R^2 \sim Exp(\lambda = \frac{1}{2})$, and by the result above we can generate it from a uniform via $R^2 = -2\ln(U_1)$.
$$R = \sqrt{R^2} = \sqrt{-2\ln(U_1)}$$ $$\Theta = 2\pi U_2$$Finally, since $X_1 = R\cos(\Theta)$ and $X_2 = R\sin(\Theta)$, plugging in the above gives
$$ X_1 = \sqrt{-2\ln(U_1)} \cos(2\pi U_2) $$ $$ X_2 = \sqrt{-2\ln(U_1)} \sin(2\pi U_2) $$where $X_1$ and $X_2$ are both standard normal RVs. This is exactly the Box-Muller transform. Uniform $\rightarrow$ Exp $\rightarrow$ Chi-squared $\rightarrow$ Normal. Everything is connected and the web is holy.
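Putting it all together in code, here is a minimal Box-Muller sketch (the function name, seeding, and moment check are my own choices):

```python
import numpy as np

def box_muller(n, rng=None):
    """Generate n standard normals from pairs of Uniform(0, 1) draws
    via the Box-Muller transform."""
    rng = rng or np.random.default_rng()
    m = (n + 1) // 2                   # number of uniform pairs needed
    u1 = rng.uniform(size=m)           # ignoring the measure-zero chance u1 == 0
    u2 = rng.uniform(size=m)
    r = np.sqrt(-2 * np.log(u1))       # R = sqrt(-2 ln U1), Rayleigh
    theta = 2 * np.pi * u2             # Theta ~ Uniform(0, 2*pi)
    x1 = r * np.cos(theta)             # X1 ~ N(0, 1)
    x2 = r * np.sin(theta)             # X2 ~ N(0, 1), independent of X1
    return np.concatenate([x1, x2])[:n]

samples = box_muller(1_000_000, np.random.default_rng(0))
print(samples.mean(), samples.std())   # ~0 and ~1
```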