How to Simplify Yule Walker Expression for AR Coefficients?


We have an AR(2) process (with the intercept omitted): $$ y_t = a_1 y_{t-1} + a_2 y_{t-2} + \varepsilon_t $$ We multiply this equation by $y_{t-s}$ for $s = 0, 1, 2, \ldots$ and take expectations: $$ \begin{aligned} E y_t y_t &= a_1 E y_{t-1} y_t + a_2 E y_{t-2} y_t + E \varepsilon_t y_t \\ E y_t y_{t-1} &= a_1 E y_{t-1} y_{t-1} + a_2 E y_{t-2} y_{t-1} + E \varepsilon_t y_{t-1} \\ E y_t y_{t-2} &= a_1 E y_{t-1} y_{t-2} + a_2 E y_{t-2} y_{t-2} + E \varepsilon_t y_{t-2} \\ &\;\;\vdots \\ E y_t y_{t-s} &= a_1 E y_{t-1} y_{t-s} + a_2 E y_{t-2} y_{t-s} + E \varepsilon_t y_{t-s} \end{aligned} $$

By definition, the autocovariances of a stationary series satisfy $E y_t y_{t-s} = E y_{t-s} y_t = E y_{t-k} y_{t-k-s} = \gamma_s$. We also know that $E \varepsilon_t y_t = \sigma^2$ and $E \varepsilon_t y_{t-s} = 0$ for $s \ge 1$. Hence, we can use the equations above to form

$$ \gamma_0 = a_1 \gamma_1 + a_2 \gamma_2 + \sigma^2 $$ $$ \begin{aligned} \gamma_1 &= a_1 \gamma_0 + a_2 \gamma_1 \\ \gamma_s &= a_1 \gamma_{s-1} + a_2 \gamma_{s-2} \end{aligned} $$ Dividing by $\gamma_0$ yields $$ \begin{aligned} \rho_1 &= a_1 \rho_0 + a_2 \rho_1 \\ \rho_s &= a_1 \rho_{s-1} + a_2 \rho_{s-2} \end{aligned} $$
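As a sanity check on this system, here is a small sketch in Python with NumPy (the values of $a_1$, $a_2$, and $\sigma^2$ below are arbitrary illustrative choices, not from the text): it solves the three Yule-Walker equations for $\gamma_0, \gamma_1, \gamma_2$ and confirms that the resulting autocorrelations satisfy the $\rho$ recursion.

```python
import numpy as np

# Arbitrary stationary AR(2) coefficients and innovation variance (illustrative only)
a1, a2, sigma2 = 0.5, -0.3, 1.0

# The three Yule-Walker equations rearranged as a linear system in
# (gamma0, gamma1, gamma2):
#   gamma0 - a1*gamma1 - a2*gamma2 = sigma2
#  -a1*gamma0 + (1 - a2)*gamma1    = 0
#  -a2*gamma0 - a1*gamma1 + gamma2 = 0
A = np.array([[1.0, -a1, -a2],
              [-a1, 1.0 - a2, 0.0],
              [-a2, -a1, 1.0]])
b = np.array([sigma2, 0.0, 0.0])
gamma0, gamma1, gamma2 = np.linalg.solve(A, b)

rho1, rho2 = gamma1 / gamma0, gamma2 / gamma0
# sigma2 appears only in the gamma0 equation, so dividing the gamma1 and
# gamma2 equations by gamma0 leaves no sigma2 term behind:
assert np.isclose(rho1, a1 * 1.0 + a2 * rho1)   # rho0 = 1
assert np.isclose(rho2, a1 * rho1 + a2 * 1.0)
```

The assertions are exactly the $\rho_1$ and $\rho_2$ equations from the display above, with $\rho_0 = 1$.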

Question: How did $ \gamma_1 / \gamma_0 $ turn into the $\rho_1$ function?

The simple algebra of this division isn't making sense to me. What happens to the variance $\sigma^2$ that appears in the $\gamma_0$ equation but is somehow eliminated in the $\rho_1$ equation?

Best answer:

Suppose we have an AR$(2)$ process

$$X_t=\frac{1}{3}X_{t-1}+\frac{1}{2}X_{t-2}+Z_t$$

In terms of the lag operator $B$, the autoregressive polynomial of this process is

$$\phi(B)=1-\frac{1}{3}B-\frac{1}{2}B^2$$

This polynomial has two real roots, both lying outside the unit circle, which implies the process is weakly stationary.
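The root condition can be verified numerically; a quick Python/NumPy sketch:

```python
import numpy as np

# Roots of 1 - (1/3)B - (1/2)B^2; np.roots takes coefficients ordered
# from the highest power of B down to the constant term.
roots = np.roots([-1/2, -1/3, 1.0])

# Both roots are real and lie outside the unit circle,
# so the AR(2) process is weakly stationary.
assert np.all(np.isreal(roots))
assert np.all(np.abs(roots) > 1)
```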

Remark: A stochastic process is said to be weakly stationary when its mean and autocovariances do not vary with time and its second moment is finite at all times.

Next multiply both sides by $X_{t-k}$ (for $k \ge 1$) and take expectations (now the part you are interested in). Since $\mu = 0$, $E(X_{t-k}X_t)$ is just the autocovariance, and $E(Z_t X_{t-k}) = 0$ because $X_{t-k}$ depends only on innovations up to time $t-k$. This yields:

$$\gamma(-k)=\frac{1}{3}\gamma(-k+1)+\frac{1}{2}\gamma(-k+2)$$

Here note $\gamma(k)=\gamma(-k)$ for any $k$.

$$\gamma(k)=\frac{1}{3}\gamma(k-1)+\frac{1}{2}\gamma(k-2)$$

Dividing both sides by $\gamma(0) = \mathrm{Var}(X_t)$ gives:

$$\rho(k)=\frac{1}{3}\rho(k-1)+\frac{1}{2}\rho(k-2)$$
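To see the recursion in action, the autocorrelations of this example can be generated exactly from $\rho(0) = 1$ and $\rho(1)$; a short Python sketch (here $\rho(1) = a_1/(1 - a_2)$, which follows from dividing the lag-1 autocovariance equation by $\gamma(0)$):

```python
# Exact autocorrelations of the AR(2) example via the Yule-Walker recursion.
a1, a2 = 1/3, 1/2
rho = [1.0, a1 / (1 - a2)]            # rho(0) = 1, rho(1) = (1/3)/(1/2) = 2/3
for k in range(2, 6):
    rho.append(a1 * rho[k-1] + a2 * rho[k-2])
# e.g. rho(2) = (1/3)(2/3) + (1/2)(1) = 13/18
```

No $\sigma^2$ ever enters this recursion: the innovation variance only fixes the scale $\gamma(0)$, which cancels in every ratio $\gamma(k)/\gamma(0)$.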

Notice $\mathrm{Var}(X) = E(X^2) - (E(X))^2$, and for a stationary series with $E(X_t) = \mu = 0$ this gives $\gamma(0) = E(X_t^2) = \mathrm{Var}(X_t)$. The autocorrelation is defined as $\rho(k) = \gamma(k)/\gamma(0)$, so in particular $\rho(0) = 1$. As for your question about the innovation variance: $\sigma^2$ appears only in the equation for $\gamma(0)$; the equations for $\gamma(k)$ with $k \ge 1$ contain no $\sigma^2$ term, so nothing needs to be "eliminated" when you divide the $\gamma(1)$ equation through by $\gamma(0)$.

The example is from a course by the State University of New York.