Fri 24 June 2016

Joint Distribution of Functions of RVs

Written by Hongjinn Park in Articles

Let's say $U_1$ and $U_2$ are independent Uniform$(0,1)$ RVs. By applying functions to them we can turn them into standard normal RVs. Very cool.


Requirements

If
  • $X_1$ and $X_2$ have a joint pdf $f_{X_1,X_2}(x_1,x_2)$
  • Given $Y_1 = g_1(X_1,X_2)$ and $Y_2 = g_2(X_1,X_2)$, the functions $g_1$ and $g_2$ can be uniquely solved for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, with solutions $x_1 = h_1(y_1,y_2)$ and $x_2 = h_2(y_1,y_2)$
  • the Jacobian determinant $J(x_1,x_2) \neq 0$

Then the joint density of $Y_1$ and $Y_2$ is the following.

$$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(h_1(y_1,y_2),h_2(y_1,y_2))\,|J(h_1(y_1,y_2),h_2(y_1,y_2))|^{-1}$$

Example

Two standard normal RVs as Cartesian coordinates $(X_1,X_2)$ into polar coordinates $(R,\theta)$

We can use the functions

$$Y_1 = \sqrt{X_1^2+X_2^2} = R$$ $$Y_2 = \arctan\bigg(\frac{X_2}{X_1}\bigg) = \theta$$

Step 1 is to solve for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, which we can do using trigonometry.

$$X_1 = R\cos(\theta)$$ $$X_2 = R\sin(\theta)$$

Step 2 is to compute the Jacobian, which is the determinant $J = \frac{\partial g_1}{\partial x_1}\frac{\partial g_2}{\partial x_2} - \frac{\partial g_1}{\partial x_2}\frac{\partial g_2}{\partial x_1}$. First, the four partial derivatives:

$$\frac{\partial g_1}{\partial x_1} = 2x_1 \frac{1}{2} (x_1^2+x_2^2)^{-\frac{1}{2}} = \frac{x_1}{(x_1^2+x_2^2)^{\frac{1}{2}}}$$ $$\frac{\partial g_1}{\partial x_2} = 2x_2 \frac{1}{2} (x_1^2+x_2^2)^{-\frac{1}{2}} = \frac{x_2}{(x_1^2+x_2^2)^{\frac{1}{2}}}$$ $$\frac{\partial g_2}{\partial x_1} = \frac{-x_2}{x_1^2} \frac{1}{1+(\frac{x_2}{x_1})^2} = \frac{-x_2}{x_1^2 + x_2^2}$$ $$\frac{\partial g_2}{\partial x_2} = \frac{1}{x_1} \frac{1}{1+(\frac{x_2}{x_1})^2} = \frac{1}{x_1 + \frac{x_2^2}{x_1}} = \frac{x_1}{x_1^2 + x_2^2}$$

Therefore the Jacobian is

$$J(x_1,x_2) = \frac{x_1}{(x_1^2+x_2^2)^{\frac{1}{2}}}\frac{x_1}{x_1^2 + x_2^2} - \frac{x_2}{(x_1^2+x_2^2)^{\frac{1}{2}}} \frac{-x_2}{x_1^2 + x_2^2}$$ $$= \frac{x_1^2}{(x_1^2 + x_2^2)^\frac{3}{2}} + \frac{x_2^2}{(x_1^2 + x_2^2)^\frac{3}{2}} = \frac{x_1^2+x_2^2}{(x_1^2 + x_2^2)^\frac{3}{2}} = \frac{1}{\sqrt{x_1^2+x_2^2}} = \frac{1}{R}$$

And therefore

$$|J(x_1,x_2)|^{-1} = R$$
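As a sanity check, we can approximate the four partial derivatives numerically with finite differences and confirm that $|J(x_1,x_2)|^{-1} = R$. A quick sketch (the test point is arbitrary):

```python
# Numerically check that |J(x1, x2)|^{-1} = R for the polar transformation.
import math

def g1(x1, x2):
    return math.sqrt(x1**2 + x2**2)   # R

def g2(x1, x2):
    return math.atan2(x2, x1)         # theta (quadrant-aware arctan)

def jacobian(x1, x2, h=1e-6):
    # Central differences for the four partial derivatives
    dg1_dx1 = (g1(x1 + h, x2) - g1(x1 - h, x2)) / (2 * h)
    dg1_dx2 = (g1(x1, x2 + h) - g1(x1, x2 - h)) / (2 * h)
    dg2_dx1 = (g2(x1 + h, x2) - g2(x1 - h, x2)) / (2 * h)
    dg2_dx2 = (g2(x1, x2 + h) - g2(x1, x2 - h)) / (2 * h)
    return dg1_dx1 * dg2_dx2 - dg1_dx2 * dg2_dx1

x1, x2 = 0.8, -1.3                    # arbitrary test point
R = math.sqrt(x1**2 + x2**2)
print(abs(jacobian(x1, x2)) ** -1)    # should be close to R
print(R)
```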

And

$$f_{Y_1,Y_2} = R\, f_{X_1,X_2}(R\cos(\theta),R\sin(\theta)) $$ $$= R \frac{1}{\sqrt{2\pi}} \frac{1}{\sqrt{2\pi}} \exp \bigg\{ -\frac{1}{2} (R\cos(\theta))^2 \bigg\} \exp \bigg\{ -\frac{1}{2} (R\sin(\theta))^2 \bigg\}$$ $$=\frac{R}{2\pi}\exp\bigg\{ -\frac{1}{2}(R^2\cos^2(\theta)+R^2\sin^2(\theta)) \bigg\} = \frac{R}{2\pi}e^{-\frac{1}{2}R^2}$$

For $\theta \in (0,2\pi)$ and $0< r <\infty$

Note that $f_\theta(\theta) = \frac{1}{2\pi}$ since the angle can be anything around the circle with constant probability.

And therefore, since the joint density factors into a function of $r$ times a function of $\theta$, $R$ and $\theta$ are independent and $f_R(r) = re^{-r^2/2}$, which is the pdf of a Rayleigh distribution with scale parameter $\sigma = 1$.
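We can check this by simulation: for standard normals $X_1, X_2$, the radius $R = \sqrt{X_1^2 + X_2^2}$ should have the Rayleigh($\sigma=1$) mean, which is $\sqrt{\pi/2} \approx 1.2533$. A Monte Carlo sketch (seed and sample size are arbitrary choices):

```python
# Monte Carlo check: R = sqrt(X1^2 + X2^2) for standard normals X1, X2
# should follow a Rayleigh(sigma=1) distribution, whose mean is sqrt(pi/2).
import math
import random

random.seed(0)
n = 200_000
r_values = [math.hypot(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
mean_r = sum(r_values) / n
print(mean_r)   # should be near sqrt(pi/2) ≈ 1.2533
```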

Example: Two uniforms into standard normals

Let's start off by showing that you can get an Exponential RV from a Uniform RV (this works for any rate parameter, but we'll use $\lambda = \frac{1}{2}$). The key observation is that $e^{-\frac{x}{2}}$ is always between 0 and 1 for positive $x$, so a Uniform$(0,1)$ RV $U$ can land above or below it. Let $X \sim Exp(\lambda = \frac{1}{2})$, then

$$P(X \le x) = 1 - e^{-\frac{x}{2}} = P(U > e^{-\frac{x}{2}}) = P\bigg(\ln(U) > -\frac{x}{2}\bigg) = P(-2\ln(U) \le x) $$

And therefore $X = -2\ln(U) \sim Exp(\lambda = \frac{1}{2})$.
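This inverse-transform trick is easy to verify by simulation: $-2\ln(U)$ should have the $Exp(\lambda = \frac{1}{2})$ mean of $1/\lambda = 2$. A minimal sketch (seed and sample size are arbitrary):

```python
# Inverse-transform check: if U ~ Uniform(0,1), then X = -2*ln(U)
# is Exp(lambda = 1/2), which has mean 1/lambda = 2.
import math
import random

random.seed(1)
n = 200_000
# 1 - random.random() lies in (0, 1], so log never sees 0
samples = [-2.0 * math.log(1.0 - random.random()) for _ in range(n)]
mean_x = sum(samples) / n
print(mean_x)   # should be near 2
```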

Tuck that bad boy away. Now let's look at going from Cartesian to polar coordinates when $(X_1,X_2)$ are standard normal RVs. This time we'll use $R^2$ instead of $R$, because $R^2$ has an Exponential marginal density rather than a Rayleigh one, and we just showed how to get an Exponential from a Uniform. (The intuition behind this move comes from the chi-squared distribution.)

$$Y_1 = X_1^2+X_2^2 = R^2$$ $$Y_2 = \arctan\bigg(\frac{X_2}{X_1}\bigg) = \theta$$ This time the Jacobian works out to the constant $J(x_1,x_2) = 2$, so for $0 < R^2 < \infty$ and $\theta \in (0,2\pi)$ $$f_{R^2,\theta}(R^2,\theta) = \frac{1}{2\pi} \frac{1}{2} e^{-R^2/2}$$

Notice that the marginal density of $\theta$ is uniform and the density of $R^2$ is exponential with parameter $\lambda = \frac{1}{2}$. Side note: remember that an exponential with $\lambda = \frac{1}{2}$ is the same as a chi-squared distribution with 2 degrees of freedom. We can double-check this, because $R^2 = X_1^2 + X_2^2$ where $X_1$ and $X_2$ are standard normal RVs, which means that $R^2$ has a chi-squared distribution with 2 degrees of freedom.
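This identity is also easy to check by simulation: the sum of squares of two standard normals and an $Exp(\lambda = \frac{1}{2})$ RV both have mean 2. A quick sketch (seed and sample size are arbitrary):

```python
# Check: R^2 = X1^2 + X2^2 for standard normals is chi-squared with 2 df,
# which is the same distribution as Exp(lambda = 1/2); both have mean 2.
import random

random.seed(2)
n = 200_000
r2 = [random.gauss(0, 1)**2 + random.gauss(0, 1)**2 for _ in range(n)]
mean_r2 = sum(r2) / n
print(mean_r2)   # should be near 2
```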

Therefore $R^2 \sim Exp(\lambda = \frac{1}{2})$, and by the transformation we tucked away earlier we can generate it from a Uniform: $-2\ln(U_1) = R^2$. Since $\theta$ is uniform on $(0,2\pi)$, we can likewise generate it as $2\pi U_2$.

$$R = \sqrt{R^2} = \sqrt{-2ln(U_1)}$$ $$\Theta = 2\pi U_2$$

Finally since $X_1 = Rcos(\theta)$ and $X_2 = Rsin(\theta)$ by plugging in the above you get

$$ X_1 = \sqrt{-2\ln(U_1)} \cos(2\pi U_2) $$ $$ X_2 = \sqrt{-2\ln(U_1)}\sin(2\pi U_2) $$

where $X_1$ and $X_2$ are both standard normal RVs. This is the Box-Muller transform. Uniform $\rightarrow$ Exp $\rightarrow$ Chi-squared $\rightarrow$ Normal. Everything is connected and the web is holy.
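The whole pipeline fits in a few lines of code. A minimal sketch of the two formulas above, checking that the output has mean near 0 and variance near 1 (seed and sample size are arbitrary):

```python
# Box-Muller: two independent uniforms in, two independent standard normals out.
import math
import random

def box_muller(u1, u2):
    # u1 must be in (0, 1] so that log(u1) is defined
    r = math.sqrt(-2.0 * math.log(u1))   # R = sqrt(-2 ln U1)
    theta = 2.0 * math.pi * u2           # Theta = 2*pi*U2
    return r * math.cos(theta), r * math.sin(theta)

random.seed(3)
n = 100_000
# 1 - random.random() lies in (0, 1], avoiding log(0)
samples = [box_muller(1.0 - random.random(), random.random()) for _ in range(n)]
x1s = [s[0] for s in samples]
mean_x1 = sum(x1s) / n
var_x1 = sum((x - mean_x1) ** 2 for x in x1s) / n
print(mean_x1, var_x1)   # should be near 0 and 1
```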


