Law of large numbers for an AR model


Suppose we have an AR(1) model: $$ x_k = \theta x_{k-1} + u_k,$$ where $(u_k)$ is a centered white-noise process with variance $\sigma^2$, and $|\theta| \lt 1$. For the stationary solution we can easily prove that $$\mathbb{E}(x_k x_{k+1}) = \frac{\sigma^2\theta}{1-\theta^2}.$$ How can I prove that $$\frac{1}{n} \sum_{k=1}^{n} x_k x_{k+1}$$ converges in probability to $\mathbb{E}(x_k x_{k+1})$, given that the $(x_k x_{k+1})$ are not independent?
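As a quick numerical sanity check of the claimed limit, here is a simulation sketch with hypothetical parameters $\theta = 0.5$, $\sigma = 1$ and Gaussian noise (the question does not assume Gaussianity; it is used here only to generate the path):

```python
import numpy as np

# Simulate one long stationary AR(1) path and compare the sample average of
# x_k * x_{k+1} with the theoretical value theta * sigma^2 / (1 - theta^2).
# Parameters theta=0.5, sigma=1 are hypothetical choices for illustration.
rng = np.random.default_rng(0)
theta, sigma, n = 0.5, 1.0, 200_000

x = np.empty(n)
# Draw x_0 from the stationary distribution, which has variance sigma^2/(1-theta^2)
x[0] = rng.normal(scale=sigma / np.sqrt(1 - theta**2))
u = rng.normal(scale=sigma, size=n)
for k in range(1, n):
    x[k] = theta * x[k - 1] + u[k]

sample_mean = np.mean(x[:-1] * x[1:])          # (1/n) * sum of x_k * x_{k+1}
theoretical = theta * sigma**2 / (1 - theta**2)  # = 2/3 for these parameters
print(sample_mean, theoretical)
```

For these parameters the sample average lands close to $2/3$, consistent with the convergence the question asks about.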


Independence is not necessary for the weak law of large numbers.

For a sequence of correlated random variables $X_i$, if the covariance $r(j) = Cov(X_k,X_{k+j})$ goes to zero as $j \to \infty$, then the weak law of large numbers still holds.

To see this, suppose $X_1, X_2, \dots$ are a sequence of correlated random variables such that $r(j)=Cov(X_k,X_{k+j})\to 0$ as $j \to \infty$.

Then for a given $\epsilon > 0$, let $R_\epsilon$ be such that $|r(n)|<\epsilon$ for all $n>R_\epsilon$. The existence of such an $R_\epsilon$ is guaranteed by the assumption $r(j)\to 0$. By Chebyshev's inequality we have:

$$P(|\bar{X}_n - E(X)|>\delta)\leq \frac{Var\left(\sum_{i=1}^nX_i\right)}{n^2\delta^2}$$

$$=\frac{\sum_{i,j=1}^n Cov(X_i,X_j)}{n^2\delta^2}$$

Splitting the double sum into pairs with $|i-j|>R_\epsilon$ (at most $n^2$ of them, each with covariance smaller than $\epsilon$) and pairs with $|i-j|\leq R_\epsilon$ (at most $n(2R_\epsilon+1)$ of them, each bounded by $r(0)$ via the Cauchy–Schwarz inequality), this is

$$\leq \frac{n^2 \epsilon + n(2R_\epsilon + 1) r(0)}{n^2 \delta^2}$$

$$\to \frac{\epsilon}{\delta^2} \quad \text{as } n \to \infty$$

Since $\epsilon$ is arbitrary, it follows that $P(|\bar{X}_n - E(X)|>\delta) \to 0$ for every $\delta > 0$; that is, $\bar{X}_n \to E(X)$ in probability, even though the $X_i$ are correlated.
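This shrinking tail probability can also be seen empirically. The Monte Carlo sketch below (again with hypothetical parameters) estimates $P(|\bar{X}_n|>\delta)$ for the AR(1) sequence itself, which is correlated but has $r(j) = \sigma^2\theta^j/(1-\theta^2) \to 0$:

```python
import numpy as np

# Estimate P(|sample mean of x_1..x_n| > delta) over many simulated stationary
# AR(1) paths, for increasing n. The sequence is correlated, yet the tail
# probability should shrink as n grows. Parameters are illustrative choices.
rng = np.random.default_rng(1)
theta, sigma, delta, reps = 0.5, 1.0, 0.2, 2000

def tail_prob(n):
    """Fraction of `reps` simulated paths whose sample mean exceeds delta."""
    x = rng.normal(scale=sigma / np.sqrt(1 - theta**2), size=reps)  # stationary start
    s = np.zeros(reps)
    for _ in range(n):
        s += x                                          # accumulate x_1, ..., x_n
        x = theta * x + rng.normal(scale=sigma, size=reps)
    return float(np.mean(np.abs(s / n) > delta))

probs = {n: tail_prob(n) for n in (10, 100, 1000)}
print(probs)  # estimated tail probabilities should decrease toward 0 with n
```

The estimates decrease with $n$, matching the Chebyshev bound above.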

To apply this line of reasoning to your situation, you need to control the correlation and show that it goes to zero as the number of lags $j$ increases. That is, you need to show that

$$r(j)= Cov(X_iX_{i+1}, X_{i+j}X_{i+j+1}) \to 0 \quad \text{ as } \quad j \to \infty$$
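For the AR(1) question this covariance can even be computed in closed form if one additionally assumes Gaussian innovations (an assumption the question does not make; without it one can still bound fourth moments directly). With Gaussian $u_k$, the process $(x_k)$ is a stationary Gaussian process, and Isserlis' theorem applies:

```latex
% Autocovariance of the stationary AR(1) process:
%   gamma(h) = Cov(x_i, x_{i+h}) = sigma^2 theta^{|h|} / (1 - theta^2).
% Isserlis' theorem for centered jointly Gaussian A, B, C, D:
%   Cov(AB, CD) = E[AC] E[BD] + E[AD] E[BC].
\[
r(j) = \operatorname{Cov}\bigl(X_i X_{i+1},\, X_{i+j} X_{i+j+1}\bigr)
     = \gamma(j)^2 + \gamma(j+1)\,\gamma(j-1)
     = \frac{2\,\sigma^4\,\theta^{2j}}{(1-\theta^2)^2}
     \qquad (j \ge 1),
\]
```

so $r(j)$ decays geometrically in $j$, and the argument above gives the weak law of large numbers for $\frac{1}{n}\sum_{k=1}^{n} x_k x_{k+1}$.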