$X_n$ is the number of heads in the sequence of tosses before the tails appear for the $n$-th time. Check if $\frac{X_n}{n}$ converges in probability.


We consider an infinite sequence of tosses of a symmetrical coin. For a natural number $n$, let $X_n$ denote the number of all the heads that appear in the sequence before the tails appear for the $n$-th time. Decide whether the sequence of random variables $\frac{X_n}{n}$ converges in probability.

Let's say that in a sequence:

  • $n =$ number of tails,
  • $N =$ number of tosses,
  • so, $N-n =$ number of heads.

I know that by the Bernoulli-trials argument, to find the probability for given $n$ and $N$ we need to place $n-1$ tails among the first $N-1$ tosses (because the last, $N$-th toss must be tails), therefore: $$\mathbb{P}(X_n = N-n) = \displaystyle {{N-1}\choose{n-1}}\left( \frac{1}{2} \right)^{n-1} \left( \frac{1}{2} \right)^{(N-1)-(n-1)} \frac{1}{2} = {{N-1}\choose{n-1}} \left( \frac{1}{2} \right)^{N}$$
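As a quick numerical sanity check (my own, not part of the original argument): taken over all possible toss counts $N \geq n$, these probabilities should sum to $1$. A short Python sketch, assuming the $n$-th tail lands exactly on toss $N$:

```python
from math import comb

def pmf(n, N):
    """Probability that the n-th tail of a fair coin occurs exactly on toss N:
    n-1 tails among the first N-1 tosses, then a tail on toss N."""
    return comb(N - 1, n - 1) * 0.5 ** N

n = 5
# Truncate the infinite sum at a large cutoff; the tail is negligible.
total = sum(pmf(n, N) for N in range(n, 400))
print(total)  # very close to 1.0
```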

Then, weighting the number of heads $N-n$ by these probabilities (the index $n$ fixes which sum we take, and $N$ ranges over the possible toss counts), we get the expected number of heads:

$$\mathbb{E}(X_n) = \displaystyle \sum^{\infty}_{N=n} {{N-1}\choose{n-1}} \left( \frac{1}{2} \right)^{N} (N-n)$$
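The series can also be evaluated numerically; a hedged Python sketch (my own check, truncating the sum at a large cutoff). The result should come out to $n$, since on average each tail is preceded by one head:

```python
from math import comb

def expected_heads(n, cutoff=600):
    """Truncated series for the expected number of heads before the n-th tail."""
    return sum(comb(N - 1, n - 1) * 0.5 ** N * (N - n) for N in range(n, cutoff))

for n in (1, 5, 20):
    print(n, expected_heads(n))  # each value is approximately n
```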

And now, as I read from wikipedia:

A sequence $\{X_n\}$ of random variables converges in probability towards the random variable $X$ if for all $\epsilon > 0$: $$\lim\limits_{n \to \infty} \mathbb{P}\big(|X_n - X| > \epsilon\big) = 0$$

But how does it apply to my situation? Where to take $X$ from and how to check whether the condition works?

On BEST ANSWER

You can think of the random variable $X_{n}$ as a sum of $n$ independent geometric random variables. Let $$Y_{j}=\text{number of heads between the $(j-1)$-th tail and the $j$-th tail}$$ Then $X_{n}=Y_{1}+\dots+Y_{n}$, where the $Y_{j}$ are i.i.d. geometric random variables (counting failures before a success, with success probability $\frac{1}{2}$). Moreover $$E\left(\frac{X_{n}}{n}\right)=\frac{1}{n}\sum\limits_{j=1}^{n}E(Y_{j})=E(Y_{1})$$ and $$Var\left(\frac{X_{n}}{n}\right)=\frac{1}{n^{2}}Var\left(\sum Y_j\right)=\frac{1}{n}Var(Y_{1})$$ Here we used that the $Y_{j}$ are i.i.d.

By Chebyshev's inequality $$P\left(\left|\frac{X_{n}}{n}-E(Y_{1})\right|\geq \varepsilon\right)\leq \frac{Var\left(\frac{X_{n}}{n}\right)}{\varepsilon^{2}}=\frac{Var(Y_{1})}{n\varepsilon^{2}}$$
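A small simulation (an illustrative sketch of my own, not from the answer) comparing the empirical tail probability against this Chebyshev bound. For a fair coin the geometric $Y_j$ (heads between consecutive tails) has $E(Y_1)=1$ and $Var(Y_1)=2$:

```python
import random

random.seed(0)

def sample_Xn(n):
    """Simulate fair-coin tosses; return the number of heads before the n-th tail."""
    heads = tails = 0
    while tails < n:
        if random.random() < 0.5:
            heads += 1
        else:
            tails += 1
    return heads

n, eps, trials = 200, 0.3, 2000
# Empirical estimate of P(|X_n/n - 1| >= eps) over many simulated sequences.
empirical = sum(abs(sample_Xn(n) / n - 1) >= eps for _ in range(trials)) / trials
# Chebyshev bound Var(Y_1)/(n * eps^2) with Var(Y_1) = 2.
bound = 2 / (n * eps ** 2)
print(empirical, bound)  # empirical probability sits below the bound
```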

So $\frac{X_{n}}{n}$ converges in probability to $E(Y_{1})$, which for a fair coin equals $\frac{1-p}{p} = 1$ with $p=\frac{1}{2}$.

In fact a much stronger result holds: by the strong law of large numbers, $\frac{X_{n}}{n}$ converges to $E(Y_{1})$ almost surely.
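To illustrate the almost-sure statement, one can follow a single simulated toss sequence and watch $\frac{X_n}{n}$ settle near $1$; a minimal sketch (my own, with an arbitrary seed):

```python
import random

random.seed(1)

# Track X_n/n along ONE infinite toss sequence as n (the tail count) grows.
heads = tails = 0
ratios = {}
checkpoints = {10, 100, 1000, 10000, 100000}
while tails < 100000:
    if random.random() < 0.5:
        heads += 1
    else:
        tails += 1
        if tails in checkpoints:
            ratios[tails] = heads / tails
print(ratios)  # the ratios drift toward 1 as n grows
```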