Convergence of the probability of a R.V.


I'm trying to solve a certain problem but I'm stuck at a point.

The problem is: I have a uniform r.v. $U$ on $[-\pi,+\pi]$ and a sequence of r.v.s $X_1,X_2,\dots$ where $X_k=\cos(kU)$. Then I have $S_n=X_1+X_2+\dots+X_n$.

The problem is to find $$\lim_{n\to\infty} P\left(\left|\frac{S_n}{n}\right|<\epsilon\right).$$

My attempt:

I know that $E\left[\frac{S_n}{n}\right]=0$ and its variance is $\frac{1}{2n}$.

I know that $$P\left(\left|\frac{S_n}{n}\right|<\epsilon\right) = P_{\frac{S_n}{n}}\big([-\epsilon,+\epsilon]\big) = \int_{-\epsilon}^{\epsilon} f_{\frac{S_n}{n}}(x)\,dx,$$

where $f_{\frac{S_n}{n}}(x)$ is the probability density of $\frac{S_n}{n}$.

I don't know how to find this probability density: I tried the characteristic function of the sum, but the result has no tractable closed form.

Maybe there is another way to find that probability with $n\rightarrow \infty$?

Thank you


There are 2 answers below.

BEST ANSWER

First, note that $$ \mathbb{E} X_k = \frac{1}{2\pi} \int_{-\pi}^\pi \cos(ku) \,du = \frac{\sin(ku)}{2\pi k}\bigg|_{u=-\pi}^{u=\pi} = 0 $$ because $\sin(k\pi) = \sin(-k\pi) = 0$ for every integer $k$. It follows from the linearity of expectation that $$ \mathbb{E}S_n = 0 \text{.} $$
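As a quick numerical sanity check, the integral above can be approximated with a midpoint rule (the helper name `mean_Xk` and the step count are illustrative choices, not from the thread):

```python
import math

def mean_Xk(k, steps=20000):
    """Midpoint-rule approximation of E[X_k] = (1/(2*pi)) * integral of cos(k*u) over [-pi, pi]."""
    h = 2 * math.pi / steps
    total = 0.0
    for j in range(steps):
        u = -math.pi + (j + 0.5) * h
        total += math.cos(k * u)
    return total * h / (2 * math.pi)

for k in (1, 2, 5):
    print(k, mean_Xk(k))  # each value is ~0, matching E[X_k] = 0
```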

Next, observe that $X_k,X_m$ are uncorrelated (if $k\neq m$) because $$ \textrm{cov}(X_k,X_m) = \mathbb{E}\left((X_k - \mathbb{E}X_k)(X_m - \mathbb{E}X_m)\right) = \frac{1}{2\pi} \int_{-\pi}^\pi \cos(ku)\cos(mu) \,du = \frac{1}{2}\delta_{k,m} $$ where $\delta_{k,m}$ is the Kronecker delta, i.e. $1$ if $k=m$ and $0$ otherwise. Note that we also computed $\mathbb{V}X_k = \textrm{cov}(X_k,X_k) = \frac{1}{2}$. It follows that $$ \mathbb{V}S_n = \sum_{k=1}^n \mathbb{V}X_k = \frac{n}{2} \text{.} $$
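The covariance computation admits the same kind of numeric check; the sketch below (with an illustrative helper name `cov_XkXm`) approximates $\frac{1}{2\pi}\int_{-\pi}^\pi \cos(ku)\cos(mu)\,du$ and recovers $\frac{1}{2}\delta_{k,m}$:

```python
import math

def cov_XkXm(k, m, steps=20000):
    """Midpoint-rule approximation of cov(X_k, X_m) = (1/(2*pi)) * integral of cos(k*u)cos(m*u)."""
    h = 2 * math.pi / steps
    total = 0.0
    for j in range(steps):
        u = -math.pi + (j + 0.5) * h
        total += math.cos(k * u) * math.cos(m * u)
    return total * h / (2 * math.pi)

print(cov_XkXm(3, 3))  # ~0.5: the variance of X_3
print(cov_XkXm(3, 7))  # ~0.0: X_3 and X_7 are uncorrelated
```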

We thus have that $$ \mathbb{E} \tfrac{S_n}{n} = 0 \quad\text{ and }\quad \mathbb{V} \tfrac{S_n}{n} = \tfrac{1}{n^2} \mathbb{V} S_n = \tfrac{1}{2n} \text{,} $$ and apply Chebyshev's inequality for random variables $X$ with finite expectation $\mu = \mathbb{E} X$ and finite variance $\sigma^2 = \mathbb{V}X$ $$ \mathbb{P}\left(|X - \mu| \geq t\sigma\right) \leq \frac{1}{t^2} \text{.} $$ This yields $$ \mathbb{P}\left(\left|\tfrac{S_n}{n}\right| \geq \frac{t}{\sqrt{2n}}\right) \leq \frac{1}{t^2} \text{,} $$ and for $\epsilon = \frac{t}{\sqrt{2n}}$, i.e. $t = \epsilon\sqrt{2n}$ we get $$ \mathbb{P}\left(\left|\tfrac{S_n}{n}\right| \geq \epsilon\right) \leq \frac{1}{2n\epsilon^2} \to 0 \text{ as $n \to \infty$.} $$ Therefore, $$ \lim_{n\to\infty} \mathbb{P}\left(\left|\tfrac{S_n}{n}\right| < \epsilon \right) = 1 \text{.} $$
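A Monte Carlo sketch (helper name `tail_prob` and all parameters are illustrative) can compare the empirical tail $P\left(\left|\frac{S_n}{n}\right| \geq \epsilon\right)$ against the Chebyshev bound $\frac{1}{2n\epsilon^2}$ derived above:

```python
import math
import random

def tail_prob(n, eps, trials=1000, seed=0):
    """Monte Carlo estimate of P(|S_n / n| >= eps) with U ~ Uniform[-pi, pi]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        u = rng.uniform(-math.pi, math.pi)
        s = sum(math.cos(k * u) for k in range(1, n + 1))
        hits += abs(s / n) >= eps
    return hits / trials

for n in (10, 100, 1000):
    eps = 0.1
    chebyshev = 1.0 / (2 * n * eps ** 2)   # the bound derived above (vacuous when > 1)
    print(n, tail_prob(n, eps), "<=", chebyshev)
```

The empirical tail probability shrinks toward 0 as $n$ grows, consistent with the limit being 1.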

Note that this argument doesn't depend on the particular form of the $X_k$. It suffices to show that $\mathbb{V} S_n = o(n^2)$, because then $\mathbb{V} \frac{S_n}{n} \to 0$ as $n \to \infty$, which is all that is required. In particular, the argument works for any sequence of uncorrelated $X_k$ whose variances are bounded by some constant $M$, because then $\mathbb{V} S_n \leq M n = o(n^2)$.
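To illustrate that only uncorrelatedness and bounded variances matter, here is the same check with independent $\pm1$ signs in place of $\cos(kU)$ (a substitution of my own for illustration; here $\mathbb{V}X_k = 1$, so $\mathbb{V}\frac{S_n}{n} = \frac{1}{n} \to 0$):

```python
import random

def rademacher_prob(n, eps=0.1, trials=1000, seed=0):
    """Empirical P(|S_n / n| < eps) where S_n is a sum of n independent +/-1 signs."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-1, 1)) for _ in range(n))
        hits += abs(s / n) < eps
    return hits / trials

print(rademacher_prob(1000))  # close to 1, as the general argument predicts
```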

ANOTHER ANSWER

If $U\ne0$ and $|U|\leqslant\pi$ then $\mathrm e^{\mathrm iU}\ne1$, hence $$ S_n=\Re\left(\sum_{k=1}^n\mathrm e^{\mathrm ikU}\right)=\Re\left(\mathrm e^{\mathrm iU}\frac{\mathrm e^{\mathrm inU}-1}{\mathrm e^{\mathrm iU}-1}\right)=\Re\left(\mathrm e^{\mathrm iU/2}\frac{\mathrm e^{\mathrm inU}-1}{2\mathrm i\sin(U/2)}\right), $$ which implies that $$ |S_n|\leqslant\left|\mathrm e^{\mathrm iU/2}\frac{\mathrm e^{\mathrm inU}-1}{2\mathrm i\sin(U/2)}\right|\leqslant\frac1{|\sin(U/2)|}. $$ Thus, $$ [U\ne0]\subseteq[|S_n|/n\to0], $$ and since $P(U=0)=0$, this implies the almost sure convergence, and in particular the convergence in probability, of $S_n/n$ to $0$.
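Both the closed form for $S_n$ and the uniform bound $|S_n|\leqslant 1/|\sin(U/2)|$ can be verified numerically; in this sketch the names `S_n` and `S_closed` are illustrative:

```python
import cmath
import math

def S_n(n, u):
    """Direct partial sum S_n = sum_{k=1}^n cos(k*u)."""
    return sum(math.cos(k * u) for k in range(1, n + 1))

def S_closed(n, u):
    """Closed form Re(e^{iu/2} * (e^{inu} - 1) / (2i sin(u/2))), valid for u != 0."""
    z = cmath.exp(1j * u / 2) * (cmath.exp(1j * n * u) - 1) / (2j * math.sin(u / 2))
    return z.real

for u in (0.3, 1.0, 2.5):
    bound = 1.0 / abs(math.sin(u / 2))
    worst = max(abs(S_n(n, u)) for n in range(1, 200))
    match = abs(S_n(50, u) - S_closed(50, u)) < 1e-9
    print(u, match, worst <= bound)  # closed form matches; bound holds uniformly in n
```

Because the bound does not depend on $n$, dividing by $n$ forces $S_n/n\to0$ pointwise on $\{U\ne0\}$.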