Gambler's ruin problem
Two gamblers bet one dollar at a time on the outcome of a coin toss. On each toss, Gambler A wins a dollar from Gambler B with probability $p$ and loses a dollar with probability $q = 1-p$, independently of all other tosses. Gambler A's initial wealth is $i$ dollars and Gambler B's initial wealth is $N-i$ dollars. They stop when someone has won all the money. What is the probability that Gambler A ends up with all the money?
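Before deriving anything, here is a minimal Monte Carlo sketch (my own illustration, assuming nothing beyond the setup above) that simulates the game and estimates A's chance of winning for a given $i$, $N$, and $p$. It is only a sanity check against the formula derived below.

```python
import random

def simulate_ruin(i, N, p, trials=100_000):
    """Estimate P(A ends up with all N dollars) by playing the game `trials` times."""
    wins = 0
    for _ in range(trials):
        wealth = i
        while 0 < wealth < N:
            # A gains a dollar with probability p, loses one with probability q = 1 - p
            wealth += 1 if random.random() < p else -1
        wins += (wealth == N)
    return wins / trials

# Fair game, A starts with 3 of 10 dollars: the estimate should be close to 3/10
print(simulate_ruin(3, 10, 0.5))
```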
Let $P_i$ denote the probability that Gambler A ends up with all the money given that he starts with $i$ dollars. Conditioning on the outcome of the first toss (A's wealth moves to $i+1$ with probability $p$ and to $i-1$ with probability $q$), we have
$$P_i = pP_{i+1} + qP_{i-1} \quad \quad i = 1,2,...,N-1$$with boundary conditions $P_0 = 0$ and $P_N = 1$. We have what is called a linear difference equation. First, cleverly note that $p+q=1$, so $P_i = (1)P_i = (p+q)P_i$ and
$$ \begin{aligned} (p+q)P_i & = pP_{i+1} + qP_{i-1} \\ pP_i+qP_i & = pP_{i+1} + qP_{i-1} \\ pP_{i+1} - pP_i & = qP_i - qP_{i-1} \\ P_{i+1} - P_i & = \frac{q}{p}(P_i - P_{i-1}) \\ \end{aligned} $$Since $P_0 = 0$, we have
$$ \begin{aligned} P_2 - P_1 & = \frac{q}{p}(P_1 - P_0) = \frac{q}{p} P_1 \\ P_3 - P_2 & = \frac{q}{p}(P_2 - P_1) = \frac{q}{p} \left( \frac{q}{p} P_1 \right) = \left( \frac{q}{p} \right)^2 P_1 \\ P_4 - P_3 & = \frac{q}{p}(P_3 - P_2) = \frac{q}{p} \left( \left( \frac{q}{p} \right)^2 P_1 \right) = \left( \frac{q}{p} \right)^3 P_1 \\ & \vdots \\ P_i - P_{i-1} & = \frac{q}{p}(P_{i-1} - P_{i-2}) = \left( \frac{q}{p} \right)^{i-1} P_1 \\ & \vdots \\ P_N - P_{N-1} & = \frac{q}{p}(P_{N-1} - P_{N-2}) = \left( \frac{q}{p} \right)^{N-1} P_1 \\ \end{aligned} $$Summing up the first $i-1$ equations we get
$$P_i - P_1 = P_1 \left[\frac{q}{p} + \left( \frac{q}{p} \right)^2 + \left( \frac{q}{p} \right)^3 + ... + \left( \frac{q}{p} \right)^{i-1}\right]$$ $$P_i = P_1 \left[\frac{q}{p} + \left( \frac{q}{p} \right)^2 + \left( \frac{q}{p} \right)^3 + ... + \left( \frac{q}{p} \right)^{i-1}\right] + P_1$$ $$P_i = P_1 \left[1 + \frac{q}{p} + \left( \frac{q}{p} \right)^2 + \left( \frac{q}{p} \right)^3 + ... + \left( \frac{q}{p} \right)^{i-1}\right]$$The formula for the sum of the first $n$ terms of a geometric series is
$$\sum_{k=0}^{n-1} ar^k = a(1 + r + r^2 + ... + r^{n-1}) = a \left( \frac{1-r^n}{1-r} \right) \quad \text{if } r \neq 1$$Therefore if $r = \frac{q}{p} \neq 1$ we have
$$P_i = P_1 \left( \frac{1- \left( \frac{q}{p} \right)^i}{1- \frac{q}{p}} \right) \quad \text{if } \frac{q}{p} \neq 1$$In the case where $\frac{q}{p} = 1$ we have $P_i = P_1 [1+1+...+1] = iP_1$ and thus
$$ P_i = \begin{cases} \frac{1- \left( \frac{q}{p} \right)^i}{1- \frac{q}{p}} P_1 & \text{if } \frac{q}{p} \neq 1\\ iP_1 & \text{if } \frac{q}{p} = 1\\ \end{cases} $$But we are not done, because the solution to our linear difference equation should not have $P_1$ in it. Using the other boundary condition $P_N = 1$, we have
$$ P_N = 1 = \begin{cases} \frac{1- \left( \frac{q}{p} \right)^N}{1- \frac{q}{p}} P_1 & \text{if } \frac{q}{p} \neq 1 \\ NP_1 & \text{if } \frac{q}{p} = 1\\ \end{cases} $$ Therefore $$ P_1 = \begin{cases} \frac{1- \frac{q}{p}}{1- \left( \frac{q}{p} \right)^N} & \text{if } \frac{q}{p} \neq 1\\ \frac{1}{N} & \text{if } \frac{q}{p} = 1\\ \end{cases} $$ And finally we have $$ P_i = \begin{cases} \frac{1- \left( \frac{q}{p} \right)^i}{1- \frac{q}{p}} \cdot \frac{1- \frac{q}{p}}{1- \left( \frac{q}{p} \right)^N} = \frac{1- \left( \frac{q}{p} \right)^i}{1- \left( \frac{q}{p} \right)^N} & \text{if } \frac{q}{p} \neq 1\\ i \cdot \frac{1}{N} = \frac{i}{N} & \text{if } \frac{q}{p} = 1\\ \end{cases} $$For example, let's say Gambler A has $1$ dollar and Gambler B has $999$ dollars.
$$ \begin{array}{c|c|c} \hline \text{Probability A wins a coin toss} & q/p & \text{A's chance to win it all} \\ \hline .5 \text{ (fair game)} & 1 & 1/1000 \\ .1 \text{ (A sad)} & 9 & \text{basically zero} \\ .9 \text{ (A happy)} & .1111 & \text{almost } 8/9 \\ \end{array} $$

So even though A has $\$1$ and B has $\$999$, if A wins each coin toss with probability $90\%$, A has a far better shot at winning everything than B. Poor B is only one dollar away from sweet victory, yet has only a $\approx .111$ chance of avoiding ruin when he wins each toss with probability $10\%$. If B doesn't win outright on the very first toss, things deteriorate quickly: he is then two wins away from victory, taking the next two tosses has probability $.1^2 = 1\%$, and his probability of ever winning from that position cannot be better than $1/81 \approx 1.235\%$.
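The closed form is easy to evaluate directly. Below is a small sketch (the helper name `ruin_probability` is mine, not from the notes) that reproduces the numbers in the table; it agrees with the Monte Carlo estimate from earlier.

```python
def ruin_probability(i, N, p):
    """P(A wins all the money | A starts with i of N dollars and wins each toss w.p. p)."""
    q = 1 - p
    if p == q:                 # the q/p = 1 (fair game) case
        return i / N
    r = q / p
    # note: for q/p > 1 and very large N, r**N overflows a float;
    # exact arithmetic (fractions.Fraction) avoids that if needed
    return (1 - r**i) / (1 - r**N)

print(ruin_probability(1, 1000, 0.5))       # fair game: 0.001
print(ruin_probability(1, 1000, 0.9))       # q/p = 1/9: roughly 8/9
print(1 - ruin_probability(1, 1000, 0.9))   # B's chance: roughly 1/9
```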
It doesn't help Gambler B much to increase his starting wealth. In the example above he had $\$999$ and was one dollar away from total victory. What if he had $\$999{,}999{,}999$ and was still only one dollar away?
If he still has only a $10\%$ chance of winning each coin toss, then applying the formula from B's point of view (so now $p = .1$, $q = .9$, $q/p = 9$, and B starts with $k$ dollars out of $N = k+1$), his chance of winning it all can never be better than $1/9$, no matter how big his initial wealth $k$ is.
$$P(\text{Gambler B wins all}) = \frac{1-9^k}{1-9^{k+1}} \stackrel{?}{<} \frac{1}{9}$$To see that this inequality holds, note that it is equivalent to
$$ \frac{9(1-9^k)}{1-9^{k+1}} < 1$$ $$ 9(1-9^k) > 1-9^{k+1}$$Notice that the inequality switched because the quantity $1-9^{k+1}$ is negative. Continuing,
$$ 9-9^{k+1} > 1- 9^{k+1}$$which is a true statement. So no matter how big the initial wealth $k$ is, B's chance of winning it all never exceeds $1/9$.
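A quick numeric check of this bound, a sketch using exact integer arithmetic so that large $k$ doesn't overflow floating point:

```python
from fractions import Fraction

def b_wins_all(k):
    """P(Gambler B wins all) when B starts with k dollars and wins each toss w.p. 0.1."""
    return Fraction(1 - 9**k, 1 - 9**(k + 1))

for k in (1, 2, 5, 20, 100):
    # increases toward, but never reaches, 1/9 = 0.111...
    print(k, float(b_wins_all(k)))
```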