Show that $Y_n = \sum_{k=1}^n X_k (\frac{1}{2})^k$ converges to $U(0,1)$


Problem. Let $X_1, X_2, \ldots$ be a sequence of independent Bernoulli random variables with success probability $\frac{1}{2}$, i.e. $P(X_i = 0) = P(X_i = 1) = \frac{1}{2}$.

Define $Y_n = \sum_{k=1}^n X_k \left(\frac{1}{2}\right)^k$ for $n = 1, 2, \ldots$. Show that $Y_n$ converges in distribution to $U(0,1)$.

Attempt. I'm thinking of proving this using the mgf (as the question hints). I have found that $E(e^{-sY}) = \frac{1-e^{-s}}{s}$ for $s > 0$, where $Y \sim U(0,1)$.
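(As a quick numerical sanity check, not part of the proof: the closed form $\frac{1-e^{-s}}{s}$ can be compared against a Monte Carlo estimate of $E(e^{-sY})$ for uniform samples. The sample size and seed below are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_laplace(s):
    """Closed form E[e^{-sY}] for Y ~ U(0,1): (1 - e^{-s}) / s."""
    return (1 - np.exp(-s)) / s

# Monte Carlo estimate of E[e^{-sY}] from 200k uniform draws
y = rng.uniform(size=200_000)
for s in (0.5, 1.0, 2.0):
    mc = np.exp(-s * y).mean()
    print(f"s={s}: MC estimate {mc:.4f}, closed form {uniform_laplace(s):.4f}")
```

The two columns agree to a few decimal places, which supports the target transform.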

Now I try to show that $\lim_{n\to\infty}E(e^{-sY_n})=E(e^{-sY}) = \frac{1-e^{-s}}{s}$. So far I've got

$$E(e^{-sY_n}) = E\left(e^{-s\sum_{k=1}^n X_k \left(\frac{1}{2}\right)^k}\right) = \prod_{k=1}^n E\left(e^{-sX_k (\frac{1}{2})^k}\right)$$ and $$E\left(e^{-sX_k (\frac{1}{2})^k}\right)=\frac{1}{2}e^{-s (\frac{1}{2})^k}$$ for all $k$. So I'm getting $E(e^{-sY_n})=\left(\frac{1}{2}\right)^n e^{-s(1-0.5^n)}$, which goes to $0$ as $n$ approaches infinity.
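(As a sanity check on that limit: one can simulate $Y_n$ directly from Bernoulli bits and estimate $E(e^{-sY_n})$ by Monte Carlo. The estimates below sit near $\frac{1-e^{-s}}{s}$ rather than near $0$, which suggests a slip somewhere in the per-term expectation. The truncation level $n = 30$ and sample size are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(1)

n, trials = 30, 200_000
# X_k iid Bernoulli(1/2); Y_n = sum_{k=1}^n X_k / 2^k
# (n = 30 terms puts Y_n within 2^-30 of its limit)
bits = rng.integers(0, 2, size=(trials, n))
weights = 0.5 ** np.arange(1, n + 1)
y_n = bits @ weights

for s in (0.5, 1.0, 2.0):
    mc = np.exp(-s * y_n).mean()
    target = (1 - np.exp(-s)) / s
    print(f"s={s}: MC estimate {mc:.4f}, (1-e^(-s))/s = {target:.4f}")
```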

I don't know whether I made an error in the calculations or somewhere else. I would appreciate it if someone could point me in the right direction.