Proving convergence in probability.


Let $Y_k$ be independent Bernoulli-distributed random variables with $P(Y_k = 1) = \frac{1}{k}$ for every $k = 1,2,\dots,n$. I need to prove that $\frac{1}{\log n} \sum_{k=1}^n Y_k$ converges in probability to $1$. How can I also check whether or not it converges almost surely?

Well, I tried some well-known inequalities, but they don't work. Kolmogorov's theorem also seems to be of no use here.


As you mentioned in your comment, convergence in probability is no longer the issue. So we first show that the normalized sum indeed converges a.s. to a finite limit in $\mathbf{R}$.


Denote $\tilde{Y}_k := \frac{Y_k}{\log n}$. To apply Kolmogorov's two-series theorem, we need to show that the series of expected values and the series of variances both converge in $\mathbf{R}$.

For the mean we have $$ \sum_{k=1}^n \mathbb{E}\tilde{Y}_k = \frac 1 {\log n} \left(1 + \frac 1 2 + \dots + \frac 1 n\right) = \frac{H_n}{\log n}, $$ where $H_n$ denotes the $n$-th harmonic number.

The harmonic number has the well-known approximation $H_n = \log n + \gamma + o(1)$, where $\gamma$ is the Euler–Mascheroni constant; the error of this approximation tends to $0$ as $n$ grows. Hence $$ \frac 1 {\log n} \left(1 + \frac 1 2 + \dots + \frac 1 n\right) = \frac{\log n + \gamma + o(1)}{\log n} = 1 + \frac{\gamma}{\log n} + o\!\left(\frac{1}{\log n}\right) \to 1, $$ so the sequence of expected values $\sum_{k=1}^n \mathbb{E}\tilde{Y}_k$ converges in $\mathbf{R}$, with limit $1$.
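As a quick numerical sanity check (just illustrative, not part of the proof), the ratio $H_n/\log n$ is already close to $1 + \gamma/\log n$ for moderate $n$:

```python
import math

def harmonic(n):
    """Harmonic number H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

GAMMA = 0.5772156649015329  # Euler–Mascheroni constant

for n in (10**2, 10**4, 10**6):
    ratio = harmonic(n) / math.log(n)
    approx = 1 + GAMMA / math.log(n)
    # ratio tends to 1, and matches the 1 + gamma/log n expansion closely
    print(n, ratio, approx)
```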

Similarly, since $\operatorname{Var} Y_k = \frac 1 k - \frac 1 {k^2}$, $$ \sum_{k=1}^n \operatorname{Var}\tilde{Y}_k = \frac 1 {\log^2 n} \left(\underbrace{1 + \frac 1 2 + \dots + \frac 1 n}_{=\, \log n + \gamma + o(1)} - \underbrace{\left(1 + \frac 1 {2^2} + \dots + \frac 1 {n^2}\right)}_{\to\, \pi^2/6}\right) \to 0, \quad \text { as } n \to \infty. $$
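The decay of the normalized variance sum can also be checked numerically (illustrative Python; the grid of $n$ values is an arbitrary choice):

```python
import math

def var_sum_normalized(n):
    """(1 / log^2 n) * sum_{k=1}^n Var(Y_k), where Var(Y_k) = 1/k - 1/k^2."""
    s = sum(1.0 / k - 1.0 / k**2 for k in range(1, n + 1))
    return s / math.log(n) ** 2

for n in (10**2, 10**4, 10**6):
    print(n, var_sum_normalized(n))  # decreases toward 0
```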

So the theorem implies that $\sum_{k=1}^{n} \frac 1 {\log n} Y_k$ converges a.s. as $n \to \infty$. Finally, almost sure convergence implies convergence in probability, and limits in probability are unique (up to a.s. equality); since the sum converges in probability to the constant $1$, the almost sure limit must also be $1$.
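For intuition, a small Monte Carlo simulation (the sample size $n = 10{,}000$ and the number of trials are arbitrary choices) shows the normalized sum clustering near $1$:

```python
import math
import random

def normalized_sum(n, rng):
    """One sample of (1/log n) * sum_{k=1}^n Y_k, with Y_k ~ Bernoulli(1/k)."""
    s = sum(1 for k in range(1, n + 1) if rng.random() < 1.0 / k)
    return s / math.log(n)

rng = random.Random(0)  # fixed seed for reproducibility
n, trials = 10_000, 200
samples = [normalized_sum(n, rng) for _ in range(trials)]
avg = sum(samples) / trials
# The average is close to E[S_n]/log n = H_n/log n, which is about 1.06
# for n = 10^4 and tends to 1 only as n grows further.
print(avg)
```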