Bayes Billiard Balls (Probability)


I am reading the book Introduction to Probability by Joe Blitzstein, and I came across the following problem.

Using a story show that for any integers $k$ and $n$ with $0 \le k \le n$, \begin{equation*} \int_{0} ^ {1} \binom{n} {k} x ^ k (1 - x) ^ {n - k} \mathrm{d} x = \frac{1} {n + 1} \end{equation*}


Here is the solution given in the textbook.


We will show that both the left-hand and right-hand sides are equal to $P(X = k)$ where $X$ is a r.v. that we will construct.

Story 1: Start with $n + 1$ balls, $n$ white and $1$ gray. Randomly throw each ball onto the unit interval $(0, 1)$, so that the positions of the balls are i.i.d. $\text{Unif}(0, 1)$. Let $X$ be the number of white balls to the left of the gray ball. Note that $X$ is discrete with support $\{0, 1, \cdots, n \}$. To find $P(X = k)$, we use LOTP (the law of total probability), conditioning on the position of the gray ball. Let $G$ be the position of the gray ball. Conditional on $G = p$, the random variable $X$ has the $\text{Bin}(n, p)$ distribution, since we can consider each of the white balls to be an independent Bernoulli trial, where success is defined as landing to the left of $p$. The random variable $G$ has PDF $f_G(p) = 1$ on $(0, 1)$ since $G \sim \text{Unif}(0, 1)$. Therefore \begin{equation*} P(X = k) = \int_{0}^{1} P(X = k \mid G = p) f_G(p) \, \mathrm{d} p = \int_{0}^{1} \binom{n}{k} p^k (1 - p)^{n - k} \, \mathrm{d} p. \end{equation*}
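Story 1 is easy to check by simulation. The following sketch (the choices $n = 5$ and the trial count are arbitrary) draws the gray ball's position $G \sim \text{Unif}(0, 1)$, counts how many of the $n$ i.i.d. uniform white balls land to its left, and tallies the counts; the empirical distribution should be close to uniform on $\{0, 1, \cdots, n\}$.

```python
import random
from collections import Counter

# Monte Carlo sketch of Story 1: G ~ Unif(0,1) is the gray ball's position,
# and X counts the white balls (i.i.d. Unif(0,1)) that land to the left of G.
random.seed(42)
n, trials = 5, 200_000
counts = Counter()
for _ in range(trials):
    g = random.random()                              # position of the gray ball
    x = sum(random.random() < g for _ in range(n))   # white balls left of g
    counts[x] += 1

for k in range(n + 1):
    print(k, counts[k] / trials)  # each frequency should be near 1/(n+1)
```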

Story 2: Start with $n + 1$ balls, all white. Randomly throw each ball onto the unit interval. Then choose one ball at random and paint it gray. Again, let $X$ be the number of white balls to the left of the gray ball. The event $\{ X = k \}$ is equivalent to the event that the $(k + 1)$st ball from the left is gray. Therefore \begin{equation*} P(X = k) = \frac{1}{n + 1} \end{equation*} for $k \in \{0, 1, \cdots , n \}$.
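Story 2 can be simulated the same way (again with arbitrary $n$ and trial count): throw $n + 1$ uniforms, pick one index uniformly at random to be gray, and count the white balls to its left. Each value of $X$ should occur with frequency near $1/(n+1)$.

```python
import random

# Monte Carlo sketch of Story 2: all n+1 balls are thrown first, then one
# ball (chosen uniformly at random) is painted gray.
random.seed(0)
n, trials = 5, 200_000
counts = [0] * (n + 1)
for _ in range(trials):
    balls = [random.random() for _ in range(n + 1)]
    gray = random.randrange(n + 1)   # index of the ball painted gray
    x = sum(b < balls[gray] for i, b in enumerate(balls) if i != gray)
    counts[x] += 1

for k, c in enumerate(counts):
    print(k, c / trials)  # each frequency should be near 1/(n+1)
```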

Since $X$ has the same distribution in both stories, we have \begin{equation*} \int_{0} ^ {1} \binom{n} {k} x ^ k (1 - x) ^ {n - k} \mathrm{d} x = \frac{1} {n + 1} \end{equation*} for $k \in \{0, 1, \cdots , n \}$.
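The identity itself can also be verified directly, using nothing beyond the Beta-function formula $B(a, b) = \Gamma(a)\Gamma(b)/\Gamma(a + b)$, which evaluates the integral exactly: $\int_0^1 \binom{n}{k} x^k (1-x)^{n-k}\, \mathrm{d}x = \binom{n}{k} B(k+1, n-k+1) = 1/(n+1)$.

```python
from math import comb, gamma

# Exact check of the identity via the Beta function:
# integral = C(n,k) * B(k+1, n-k+1) = C(n,k) * k!(n-k)!/(n+1)! = 1/(n+1).
def lhs(n, k):
    beta = gamma(k + 1) * gamma(n - k + 1) / gamma(n + 2)
    return comb(n, k) * beta

for n in range(1, 8):
    for k in range(n + 1):
        assert abs(lhs(n, k) - 1 / (n + 1)) < 1e-12
print("identity holds for all tested n, k")
```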


I have the following questions.

  1. In Story 1 we have assumed that each ball is thrown randomly onto the unit interval $(0, 1)$, such that the positions of the balls are i.i.d. $\text{Unif}(0, 1)$. Where is the independence assumption used in the solution? In other words, will the same argument hold if we do not assume independence for the positions of the balls?
  2. In Story 2 we have assumed that each ball is equally likely to be painted gray. How is this related to Story 1? In other words, how is the assumption that each ball is equally likely to be painted gray used to conclude that $X$ has the same distribution in both the stories?
Answer:

1: If the ball positions were not i.i.d., then you cannot assume that, conditional on the position of the gray ball, $X \sim \text{Bin}(n, p)$. The solution in fact notes that this assumption is valid "[because] we can consider each of the white balls to be an independent Bernoulli trial". If the positions were somehow dependent, e.g. on the position of the gray ball, then the distribution of $X$ could be something completely different from binomial, even though the marginal distribution of each ball is still uniform on $(0, 1)$.
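To make this concrete, here is a sketch of an extreme dependent case: every marginal stays $\text{Unif}(0, 1)$, but the $n$ white balls all share a single uniform draw. Then $X$ can only be $0$ or $n$, which is nothing like a $\text{Bin}(n, p)$ count, so the integral in Story 1 no longer computes $P(X = k)$.

```python
import random

# Dependent white balls: all n of them sit at ONE common uniform position u,
# while the gray ball g is drawn independently. Each ball is still marginally
# Unif(0,1), but X = n if u < g and X = 0 otherwise.
random.seed(1)
n, trials = 5, 100_000
counts = [0] * (n + 1)
for _ in range(trials):
    u = random.random()        # common position of ALL white balls
    g = random.random()        # gray ball, independent of the whites
    x = n if u < g else 0      # all white balls left of gray, or none
    counts[x] += 1

print([c / trials for c in counts])  # mass only at k = 0 and k = n, each near 1/2
```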

2: In the first scenario, you place $n + 1$ balls on the unit interval, and one of them is gray. In the second, you place $n + 1$ white balls and paint one of them gray, chosen uniformly at random. It is quite intuitive that these are equivalent: in both cases the $n + 1$ positions are i.i.d., hence exchangeable, so the gray ball is equally likely to occupy any of the $n + 1$ ranks, and you have no reason to believe the gray one is more likely to be in one region than any other.