Covariance of Random Variables and limit of empirical expected value


Let $\{X_i\}$ be uncorrelated, identically distributed random variables with $\mathbb{E}[X_1]=0$ and $\mathbb{E}[X_1^2]=1.$ Given are $$ Y_k=\frac{1}{2} X_{k-1}+X_{k}, k\geq 0 $$ and $$ S_n=Y_1+Y_2+...+Y_n, n\geq 1. $$ (a) Calculate $\operatorname{Cov}(Y_k, Y_{k+j})$ for $k\geq 0, j\geq 0$.
(b) Show that for all $\varepsilon>0$, $$ \mathbb{P}\left(\left|\frac{S_n}{n}\right|\geq \varepsilon\right)\stackrel{n\rightarrow\infty}{\rightarrow}0. $$


I got this far: (a) We have $$ \begin{aligned} & \operatorname{Cov}(Y_k, Y_{k+j})=\mathbb{E}[Y_k Y_{k+j}]-\mathbb{E}[Y_k]\mathbb{E}[Y_{k+j}]\\ &= \mathbb{E}\left[\left(\frac{1}{2} X_{k-1}+X_{k}\right)\left(\frac{1}{2} X_{k+j-1}+X_{k+j}\right)\right]-\mathbb{E}\left[\frac{1}{2} X_{k-1}+X_{k}\right]\mathbb{E}\left[\frac{1}{2} X_{k+j-1}+X_{k+j}\right]\\ &=\mathbb{E}\left[\frac{1}{4}X_{k-1}X_{k+j-1}+\frac{1}{2}X_{k-1}X_{k+j}+\frac{1}{2} X_{k}X_{k+j-1}+X_{k}X_{k+j}\right]\\ &-\left(\frac{1}{2}\mathbb{E}\left[ X_{k-1}\right]+\mathbb{E}\left[X_{k}\right]\right)\left(\frac{1}{2}\mathbb{E}\left[ X_{k+j-1}\right]+\mathbb{E}\left[X_{k+j}\right]\right)\\ \end{aligned} $$ Since every $X_i$ has mean $0$, the subtracted product of expectations vanishes, so only $$ \frac{1}{4}\mathbb{E}\left[X_{k-1}X_{k+j-1}\right]+\frac{1}{2}\mathbb{E}\left[X_{k-1}X_{k+j}\right]+\frac{1}{2}\mathbb{E}\left[ X_{k}X_{k+j-1}\right]+\mathbb{E}\left[X_{k}X_{k+j}\right] $$ remains. How do I continue from here?
(b) With Chebyshev's inequality (applicable here since $\mathbb{E}[S_n/n]=0$) we get $$ \mathbb{P}\left(\left|\frac{S_n}{n}\right|\geq \varepsilon\right)\leq\frac{\operatorname{Var}\left(\frac{S_n}{n}\right)}{\varepsilon^2}=\frac{\operatorname{Var}\left(S_n\right)}{\varepsilon^2n^2} $$ Here I am also stuck. Hints would be great, thanks in advance.

Best answer:

The covariance depends only on $j$: $$\operatorname{Cov}(Y_k,Y_{k+j})=\frac{1}{4}\mathbb{E}\left[X_{k-1}X_{k+j-1}\right]+\frac{1}{2}\mathbb{E}\left[X_{k-1}X_{k+j}\right]+\frac{1}{2}\mathbb{E}\left[ X_{k}X_{k+j-1}\right]+\mathbb{E}\left[X_{k}X_{k+j}\right]$$ Since the random variables $X_i$ are uncorrelated with mean $0$, we have $\mathbb{E}[X_iX_j]=\mathbb{E}[X_i]\mathbb{E}[X_j]=0$ whenever $i\neq j$, which gives the following cases:

  • If $j=0$ : $$\operatorname{Cov}(Y_k,Y_{k})=\frac{1}{4}\mathbb{E}\left[X_{k-1}^2\right]+\frac{1}{2}\mathbb{E}\left[X_{k-1}X_{k}\right]+\frac{1}{2}\mathbb{E}\left[ X_{k}X_{k-1}\right]+\mathbb{E}\left[X_{k}^2\right]$$ $$=\frac{1}{4}\mathbb{E}\left[X_{k-1}^2\right]+\mathbb{E}\left[X_{k}^2\right]=\frac{1}{4}+1=\frac{5}{4}$$

  • If $j=1$ : $$\operatorname{Cov}(Y_k,Y_{k+1})=\frac{1}{4}\mathbb{E}\left[X_{k-1}X_{k}\right]+\frac{1}{2}\mathbb{E}\left[X_{k-1}X_{k+1}\right]+\frac{1}{2}\mathbb{E}\left[X_{k}^2\right]+\mathbb{E}\left[X_{k}X_{k+1}\right]$$ $$=\frac{1}{2}\mathbb{E}\left[X_k^2\right]=\frac{1}{2} $$

  • If $j>1$ : all four products involve distinct indices, so $$\operatorname{Cov}(Y_k,Y_{k+j})=\frac{1}{4}\mathbb{E}\left[X_{k-1}X_{k+j-1}\right]+\frac{1}{2}\mathbb{E}\left[X_{k-1}X_{k+j}\right]+\frac{1}{2}\mathbb{E}\left[X_{k}X_{k+j-1}\right]+\mathbb{E}\left[X_{k}X_{k+j}\right]=0$$
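If it helps to double-check the three cases numerically, here is a short Monte Carlo sketch. The setup is hypothetical: the problem only assumes the $X_i$ are uncorrelated with mean $0$ and variance $1$, and for simulation purposes I take them to be i.i.d. standard normals (which satisfy those assumptions); the seed, sample size, and index $k$ are arbitrary choices.

```python
import numpy as np

# Hypothetical check: draw i.i.d. standard normal X_i (mean 0, variance 1,
# hence uncorrelated) and estimate Cov(Y_k, Y_{k+j}) empirically.
rng = np.random.default_rng(0)
trials = 200_000
k = 5  # arbitrary fixed index

X = rng.standard_normal((trials, k + 4))   # columns play the role of X_0, X_1, ...
Y = 0.5 * X[:, :-1] + X[:, 1:]             # Y[:, m] = (1/2) X_m + X_{m+1}

def emp_cov(a, b):
    # empirical covariance E[ab] - E[a]E[b]
    return np.mean(a * b) - np.mean(a) * np.mean(b)

c0 = emp_cov(Y[:, k], Y[:, k])      # j = 0: theory says 5/4
c1 = emp_cov(Y[:, k], Y[:, k + 1])  # j = 1: theory says 1/2
c2 = emp_cov(Y[:, k], Y[:, k + 2])  # j = 2: theory says 0
print(c0, c1, c2)
```

With this many draws the three empirical covariances come out close to $5/4$, $1/2$ and $0$ respectively.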

We finally get $$\operatorname{Cov}(Y_k,Y_{k+j})=\frac{5}{4}\delta_0(j)+\frac{1}{2}\delta_1(j).$$ In particular $\operatorname{Var}(Y_i)=\operatorname{Cov}(Y_i,Y_i)=\frac{5}{4}$, but note that the $Y_i$ are *not* uncorrelated: neighbouring terms have covariance $\frac{1}{2}$, so the variance of the sum picks up these cross terms: $$\operatorname{Var}(S_n)=\sum_{i=1}^n \operatorname{Var}(Y_i)+2\sum_{i=1}^{n-1}\operatorname{Cov}(Y_i,Y_{i+1})=\frac{5n}{4}+2\cdot\frac{n-1}{2}=\frac{9n-4}{4}.$$ Using your inequality now yields $$\mathbb P\left(\left|\frac{S_n}{n}\right|\geq\varepsilon\right)\leq \frac{\operatorname{Var}(S_n)}{n^2\varepsilon^2}=\frac{9n-4}{4n^2\varepsilon^2}\leq\frac{9}{4n\varepsilon^2}\overset{n\rightarrow \infty}{\longrightarrow} 0.$$ Finally $$\lim_{n\rightarrow \infty}\mathbb P\left(\left|\frac{S_n}{n}\right|\geq\varepsilon\right)=0.$$
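As a quick illustration of part (b), the tail probability can also be estimated by simulation. Again the concrete distribution is a hypothetical choice (i.i.d. standard normal $X_i$), and $\varepsilon=0.25$, the seed, and the two values of $n$ are arbitrary; the point is only that the empirical probability shrinks as $n$ grows.

```python
import numpy as np

# Hypothetical illustration: estimate P(|S_n / n| >= eps) for a small and a
# large n, using i.i.d. standard normal X_i as a stand-in for the
# uncorrelated sequence in the problem.
rng = np.random.default_rng(1)
trials = 20_000
eps = 0.25

def tail_prob(n):
    X = rng.standard_normal((trials, n + 1))  # columns X_0, ..., X_n
    Y = 0.5 * X[:, :-1] + X[:, 1:]            # Y_1, ..., Y_n
    S = Y.sum(axis=1)                         # S_n per trial
    return np.mean(np.abs(S / n) >= eps)      # empirical tail probability

p_small, p_large = tail_prob(50), tail_prob(2000)
print(p_small, p_large)
```

For $n=2000$ the empirical probability is essentially $0$, while for $n=50$ it is still clearly positive, matching the $O(1/n)$ Chebyshev bound.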