The independent binary random variables $X_k$ take the values $\pm 1$ with probabilities $(1 \pm k^{-1/2})/2$, $k = 1,2,\dots$. Find $\lim_{n \rightarrow \infty} \mathbb{P}(X_1 + \cdots + X_n \leq 0)$.
Here are my thoughts so far:
I'd like to use the Central Limit Theorem here -- if $X_1, X_2, \dots$ is a sequence of i.i.d. random variables, each with mean $\mu$ and variance $\sigma^2$, then the distribution of $\frac{X_1 + \cdots + X_n - n\mu}{\sigma \sqrt{n}}$ tends to the standard normal as $n \rightarrow \infty$.
However, here, although $X_1,\dots,X_n$ are independent, they are not identically distributed.
I first let $Y_k = \frac{X_k + 1}{2}$, so that $\mathbb{P}(Y_k = 0) = \frac{1 - k^{-1/2}}{2}$ and $\mathbb{P}(Y_k = 1) = \frac{1 + k^{-1/2}}{2}$. That is, $Y_k$ is a Bernoulli random variable with success probability $\frac{1 + k^{-1/2}}{2}$. This makes the calculations easier to handle, since we're now working with random variables we're familiar with -- but still, $Y_1,\dots,Y_n$ are independent but not identically distributed.
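(For reference, here is the routine computation of the first two moments, which any CLT normalization will need -- this is just spelled out from the definition above:)
$$ \mathbb{E}(X_k)=(+1)\cdot\frac{1+k^{-1/2}}{2}+(-1)\cdot\frac{1-k^{-1/2}}{2}=\frac{1}{\sqrt{k}}\ ,\qquad \operatorname{Var}(X_k)=\mathbb{E}(X_k^2)-\mathbb{E}(X_k)^2=1-\frac{1}{k}\ . $$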
Where can I continue from here? Another idea is to compute the moment generating function of $X_1 + \cdots + X_n$ and match it, via the uniqueness theorem, to the distribution of a familiar random variable -- but the resulting moment generating function doesn't correspond to any familiar distribution, since a sum of Bernoulli random variables with different success probabilities does not follow a standard named distribution (see the sketch below). Thus, I'm sticking with my instinct that the Central Limit Theorem is the way to go here.
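To make the m.g.f. remark concrete (a quick sketch of the standard computation):
$$ \mathbb{E}\left(e^{tX_k}\right)=\frac{1+k^{-1/2}}{2}\,e^{t}+\frac{1-k^{-1/2}}{2}\,e^{-t}=\cosh t+\frac{\sinh t}{\sqrt{k}}\ , $$
so by independence
$$ \mathbb{E}\left(e^{t(X_1+\cdots+X_n)}\right)=\prod_{k=1}^n\left(\cosh t+\frac{\sinh t}{\sqrt{k}}\right)\ , $$
a product that doesn't collapse into the m.g.f. of any standard distribution.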
Any help would be appreciated. Thanks!
The random variables $X_n$ satisfy the Lindeberg condition, and hence the central limit theorem holds, in the form $$ \frac{\sum_{k=1}^n\left(X_k-\mathbb{E}\left(X_k\right)\right)}{s_n}\overset{\mathcal D}{\rightarrow}\mathcal{N}(0,1)\ \ \text{ as }\ n\rightarrow\infty\ , $$ where $\ s_n=\sqrt{\sum_{k=1}^n \mathbb{E} \left(\left(X_k-\mathbb{E}\left(X_k\right)\right)^2 \right)}\ $.
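(For completeness, the condition being invoked reads, in this notation -- a standard statement, added here for reference:)
$$ \lim_{n\rightarrow\infty}\frac{1}{s_n^2}\sum_{k=1}^n\mathbb{E}\left(\left(X_k-\mathbb{E}\left(X_k\right)\right)^2\,\mathbf{1}_{\left\{\left|X_k-\mathbb{E}\left(X_k\right)\right|>\varepsilon s_n\right\}}\right)=0\ \ \text{ for every }\ \varepsilon>0\ . $$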
Here, $$ \mathbb{E}\left(X_k\right)=\frac{1}{\sqrt{k}} \quad\text{and}\quad s_n^2=\sum_{k=1}^n\left(1-\frac{1}{k}\right)=n-\sum_{k=1}^n\frac{1}{k}\ . $$ Since $\ \sum_{k=1}^n\frac{1}{\sqrt{k}}\sim2\sqrt{n}\ $ and $\ s_n\sim\sqrt{n}\ $, $$ \lim_{n\rightarrow\infty} \frac{\sum_{k=1}^n \mathbb{E}\left(X_k\right)}{s_n} = \lim_{n\rightarrow\infty}\frac{\sum_{k=1}^n \frac{1}{\sqrt{k}}}{s_n}=2\ , $$ and therefore \begin{align} \lim_{n\rightarrow\infty}\mathbb{P}\left(\sum_{k=1}^nX_k\le0\right)&= \lim_{n\rightarrow\infty}\mathbb{P}\left(\frac{\sum_{k=1}^n\left(X_k-\mathbb{E}\left(X_k\right)\right)}{s_n}\le-\frac{\sum_{k=1}^n \frac{1}{\sqrt{k}}}{s_n}\right)\\ &= \Phi\left(-2\right)\\ &\approx0.023\ , \end{align} where $\Phi$ denotes the standard normal c.d.f. That the $X_n$ satisfy the Lindeberg condition is not obvious, but unless there's a blunder in my arithmetic (of which the probability is not entirely negligible) the proof is fairly straightforward. I'll post it here if there's a request for me to do so.
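As a quick numerical sanity check, here is a minimal Monte Carlo sketch (not part of the argument above; the horizon $n$, the number of trials, and the seed are arbitrary choices, and convergence to $\Phi(-2)$ is fairly slow):

```python
import numpy as np

# Monte Carlo estimate of P(X_1 + ... + X_n <= 0), where
# X_k = +1 with probability (1 + k**-0.5) / 2 and -1 otherwise.
rng = np.random.default_rng(0)        # arbitrary seed
n, trials = 10_000, 100_000           # arbitrary horizon / sample size

totals = np.zeros(trials)
for k in range(1, n + 1):
    p_k = (1 + k ** -0.5) / 2         # P(X_k = +1)
    totals += np.where(rng.random(trials) < p_k, 1.0, -1.0)

# For n = 10_000 this lands near 0.023, consistent with Phi(-2) ~ 0.0228.
print((totals <= 0).mean())
```

Looping over $k$ rather than drawing an $n \times \text{trials}$ matrix keeps memory bounded; at this sample size the estimate's standard error is roughly $0.0005$.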