Thu 8 November 2018

Notes on the Gamma distribution

Written by Hongjinn Park in Articles

A Gamma RV with integer shape parameter is the sum of iid Exponential RVs.

The interarrival times of a Poisson process are iid Exponential RVs with rate $\lambda$. But what if we want to know the total time until the $n$th arrival? That is,

$$T_n = \sum_{i=1}^n X_i$$

We can get the pdf of $T_n$ by convolving $n$ Exponential densities. Start with two: let $X$ and $Y$ be independent Exponential RVs with rate $\lambda = 1$. Then

$$f_{X + Y}(t) = \int_{0}^{\infty} f_X(x) f_Y(t-x)\, dx = \int_{0}^{t} e^{-x} e^{-t+x} \, dx = e^{-t} \int_{0}^{t} \, dx = t e^{-t}$$

where the upper limit drops from $\infty$ to $t$ because $f_Y(t-x) = 0$ for $x > t$.
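As a quick sanity check (a sketch using numpy and scipy, not part of the original derivation), we can evaluate the convolution integral numerically at a few values of $t$ and compare against $t e^{-t}$:

```python
import numpy as np
from scipy import integrate

def f_expo(x):
    # Expo(1) density, zero for negative arguments
    return np.exp(-x) * (x >= 0)

# Numerical convolution of two Expo(1) densities at several t;
# each result should match t * exp(-t).
for t in [0.5, 1.0, 2.0, 5.0]:
    conv, _ = integrate.quad(lambda x: f_expo(x) * f_expo(t - x), 0, t)
    assert abs(conv - t * np.exp(-t)) < 1e-8
```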

Continuing on, let $X \sim Gamma(2,1)$ and do a convolution with $Y \sim Expo(1)$.

$$f_{X + Y}(t) = \int_{0}^{\infty} xe^{-x} e^{-t+x} dx = e^{-t} \int_{0}^{t} x \, dx = e^{-t} \left[ \frac{x^2}{2} \right]_{0}^t = \frac{t^2}{2}e^{-t}$$

Again, let $X \sim Gamma(3,1)$ and do a convolution with $Y \sim Expo(1)$.

$$f_{X + Y}(t) = \int_{0}^{\infty} \frac{x^2}{2}e^{-x} e^{-t+x} dx = e^{-t} \int_{0}^{t} \frac{x^2}{2} \, dx = e^{-t} \left[ \frac{x^3}{6} \right]_{0}^t = \frac{t^3}{6}e^{-t}$$

Now we can see the pattern and say that $X \sim Gamma(n,1)$ has pdf

$$f_X(x) = \frac{x^{n-1}}{(n-1)!}e^{-x}$$
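The claimed pdf can be compared against scipy's built-in Gamma pdf with shape $n$ and scale $1$; the `gamma_pdf` helper below is an illustrative name, not from the notes:

```python
import math
import numpy as np
from scipy import stats

def gamma_pdf(x, n):
    # The Gamma(n, 1) pdf from the pattern above: x^(n-1) e^(-x) / (n-1)!
    return x ** (n - 1) * math.exp(-x) / math.factorial(n - 1)

xs = np.linspace(0.1, 10, 50)
for n in [1, 2, 3, 5]:
    ours = np.array([gamma_pdf(x, n) for x in xs])
    ref = stats.gamma.pdf(xs, a=n)  # scale defaults to 1
    assert np.allclose(ours, ref)
```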

We can prove this by induction. For the base case, $X \sim Gamma(1,1)$ has pdf $f_X(x) = e^{-x}$, which is the pdf of an Exponential with rate $1$.

Now assume the formula holds for $n$ and show it holds for $n+1$. We need the pdf of $X_1 + ... + X_n + X_{n+1}$. By the inductive hypothesis, this sum is the convolution of a $Gamma(n,1)$ and an Exponential with rate $1$. Using the convolution formula,

$$f_{X + Y}(t) = \int_{0}^{\infty} \frac{x^{n-1}}{(n-1)!}e^{-x} e^{-t+x} dx = \frac{e^{-t} }{(n-1)!}\int_{0}^{t} x^{n-1} \, dx = \frac{e^{-t} }{(n-1)!} \left[ \frac{x^n}{n} \right]_{0}^t = \frac{t^n}{n!}e^{-t}$$
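The induction step can also be checked symbolically, for example with sympy (an assumed illustration): for several concrete $n$, convolving the $Gamma(n,1)$ pdf with an $Expo(1)$ pdf should return the $Gamma(n+1,1)$ pdf.

```python
import sympy as sp

x, t = sp.symbols("x t", positive=True)

# For each n, the integrand x^(n-1)/(n-1)! * e^(-x) * e^(-(t-x))
# simplifies to x^(n-1)/(n-1)! * e^(-t); integrating over [0, t]
# should give t^n/n! * e^(-t), the Gamma(n+1, 1) pdf.
for n in [1, 2, 3, 4]:
    integrand = x ** (n - 1) / sp.factorial(n - 1) * sp.exp(-x) * sp.exp(-(t - x))
    conv = sp.integrate(integrand, (x, 0, t))
    expected = t ** n / sp.factorial(n) * sp.exp(-t)
    assert sp.simplify(conv - expected) == 0
```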

which we recognize as the pdf of a $Gamma(n+1,1)$. Now to get the general formula, note that if $X \sim Gamma(n,1)$ then $Y = \frac{X}{\lambda}$ is $Gamma(n,\lambda)$.

$$F_Y(y) = P(Y \le y) = P \left( \frac{X}{\lambda} \le y \right) = P(X \le \lambda y) = F_X(\lambda y)$$

and taking derivatives,

$$f_Y(y) = \lambda f_X(\lambda y) = \lambda \frac{(\lambda y)^{n-1}}{(n-1)!}e^{-(\lambda y)} = \frac{\lambda e^{-\lambda y}(\lambda y)^{n-1}}{(n-1)!}$$
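This general pdf can be checked against scipy, which parameterizes the Gamma by shape $a$ and scale $= 1/\lambda$; the `gamma_pdf` helper below is just an illustrative name:

```python
import math
import numpy as np
from scipy import stats

def gamma_pdf(y, n, lam):
    # The Gamma(n, lambda) pdf derived above
    return lam * math.exp(-lam * y) * (lam * y) ** (n - 1) / math.factorial(n - 1)

ys = np.linspace(0.1, 5, 40)
for n, lam in [(2, 1.0), (3, 0.5), (5, 2.0)]:
    ours = np.array([gamma_pdf(y, n, lam) for y in ys])
    ref = stats.gamma.pdf(ys, a=n, scale=1 / lam)  # scipy scale = 1/rate
    assert np.allclose(ours, ref)
```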

Also note that if $X \sim Gamma(n,\lambda)$ then $Y = \lambda X$ is $Gamma(n,1)$. Instead of using cdfs, we can show this by using the formula

$$f_Y(y) = f_X(x) \left| \frac{dx}{dy} \right| \quad \text{where} \quad x = g^{-1}(y)$$

Since we have $y = \lambda x$ this means that $x = \frac{y}{\lambda}$ and so $\frac{dx}{dy} = \frac{1}{\lambda}$ and so

$$f_{\lambda X}(y) = \frac{\lambda e^{-\lambda \frac{y}{\lambda}}(\lambda \frac{y}{\lambda})^{n-1}}{(n-1)!} \frac{1}{\lambda} = \frac{y^{n-1}}{(n-1)!}e^{-y}$$

which proves that $\lambda X$ is a $Gamma(n,1)$.
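A quick Monte Carlo sketch of this scaling property: draw from $Gamma(n, \lambda)$, multiply by $\lambda$, and run a Kolmogorov-Smirnov test against $Gamma(n, 1)$ (the seed, sample size, and parameter values below are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, lam = 4, 2.5

# numpy's Generator.gamma takes shape and scale = 1/rate
x = rng.gamma(shape=n, scale=1 / lam, size=100_000)

# KS test of lambda * X against the Gamma(n, 1) cdf; a large p-value
# means no evidence against the claimed distribution.
ks = stats.kstest(lam * x, "gamma", args=(n,))
assert ks.pvalue > 1e-4
```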

MGF of the Gamma

Let's get the MGF of $X \sim Gamma(a,1)$.

$$ M_X(t) = E[e^{tX}] = \int_0^\infty \frac{ x^{a-1}}{\Gamma(a)} e^{-x} e^{tx} \, dx = \frac{1}{\Gamma(a)} \int_0^\infty x^{a-1} e^{-x(1-t)} \, dx $$

Substituting $u = (1-t)x$, so that $x = \frac{u}{1-t}$ and $dx = \frac{du}{1-t}$ (we need $t < 1$ for the integral to converge and for the limits to remain $0$ to $\infty$), we have

$$= \frac{1}{\Gamma(a)} \int_0^\infty \left( \frac{u}{(1-t)} \right) ^{a-1} e^{-u} \, \frac{du}{(1-t)} = \frac{1}{(1-t)^a} \int_0^\infty \frac{u^{a-1} e^{-u}}{\Gamma(a)} \, du = (1-t)^{-a}$$

So we have

$$M_X'(t) = a(1-t)^{-a-1} \quad \quad M_X'(0) = a$$ $$M_X''(t) = a(a+1)(1-t)^{-a-2} \quad \quad M_X''(0) = a^2+a$$

Therefore if $X \sim Gamma(a,1)$ then $E[X] = a$ and $Var(X) = a^2 + a - a^2 = a$.

If $Y \sim Gamma(a, \lambda)$ then since $Y = \frac{X}{\lambda}$ we have $E[Y] = \frac{a}{\lambda}$ and $Var(Y) = \frac{a}{\lambda^2}$.
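These moments match what scipy reports for a Gamma with shape $a$ and scale $1/\lambda$ (a minimal check, with arbitrary values of $a$ and $\lambda$):

```python
from scipy import stats

a, lam = 3.0, 2.0

# scipy returns (mean, variance) for moments="mv"; expect a/lam and a/lam^2
mean, var = stats.gamma.stats(a, scale=1 / lam, moments="mv")
assert abs(mean - a / lam) < 1e-12
assert abs(var - a / lam ** 2) < 1e-12
```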


