MLE of AR(2) time series model


Considering the AR(2) model $X_t=\phi_1X_{t-1}+\phi_2 X_{t-2} + \epsilon_t$, where $\{\epsilon_t\}\sim \mathcal{N}(0,\sigma^2),$ the conditional likelihood is:

$$ L(\phi_1,\phi_2,\sigma^2)=(2\pi\sigma^2)^{-\frac{T-2}{2}}\exp \left(-\frac{1}{2\sigma^2} \sum_{t=1}^{T-2}(X_t-\phi_1X_{t-1}-\phi_2 X_{t-2})^2\right)$$

I solved the MLE of the variance as this is just standard computation, but how would I proceed to find that of the coefficients of the process, $\phi_1$ and $\phi_2$?

I get the log-likelihood as

$$ \ell(\phi_1,\phi_2,\sigma^2) =\frac{-(T-2)}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{t=1}^{T-2}(X_t-\phi_1X_{t-1}-\phi_2 X_{t-2})^2$$

which makes maximising difficult.

Best Answer

The conditional MLE for an autoregressive model can be obtained from standard linear regression theory. The simplest route is to write the model in matrix form and apply the standard formula for the OLS coefficient estimator. Nevertheless, it is also possible to solve the maximisation problem in scalar form, and I'll show you how to do this.

Your summation in this conditional-likelihood is written incorrectly. I am presuming that you have observed time values $t = 1, 2, ..., T$, in which case you are conditioning on the values $t = 1, 2$ and the summation goes from $t = 3, ..., T$. It is also simpler if you write your equations in terms of the precision parameter $\lambda \equiv \sigma^{-2}$. With these changes, and further simplification from removing additive constants, the conditional-log-likelihood function can be written as:

$$\ell (\phi_1, \phi_2, \lambda) = \frac{T-2}{2} \cdot \ln (\lambda) - \frac{\lambda}{2} \sum_{t=3}^T (x_t - \phi_1 x_{t-1} - \phi_2 x_{t-2})^2.$$

To facilitate our analysis we define the sample quantities $s_{i,j} \equiv \sum_{t=3}^T x_{t-i} x_{t-j}$ for $i,j = 0, 1, 2$. Using this notation the score functions are:
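As a sketch (not part of the original answer), the sample quantities $s_{i,j}$ are straightforward to compute with NumPy. The function name `cross_sums` is my own; the indexing just translates the 1-based definition $s_{i,j} = \sum_{t=3}^T x_{t-i} x_{t-j}$ into 0-based slices:

```python
import numpy as np

def cross_sums(x):
    """Compute s_{i,j} = sum_{t=3}^T x_{t-i} x_{t-j} for i, j in {0, 1, 2}.

    The series x is 0-based, so the 1-based time t = 3, ..., T maps to
    indices 2, ..., len(x) - 1, and x_{t-i} is the slice x[2-i : len(x)-i].
    """
    x = np.asarray(x, dtype=float)
    s = {}
    for i in range(3):
        for j in range(3):
            s[(i, j)] = float(np.dot(x[2 - i:len(x) - i], x[2 - j:len(x) - j]))
    return s
```

Note that $s_{i,j} = s_{j,i}$ by construction, so only six of the nine quantities are distinct.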

$$\begin{equation} \begin{aligned} \frac{\partial \ell}{\partial \lambda} (\phi_1, \phi_2, \lambda) &= \frac{T-2}{2 \lambda} - \frac{1}{2} \sum_{t=3}^T (x_t - \phi_1 x_{t-1} - \phi_2 x_{t-2})^2, \\[8pt] \frac{\partial \ell}{\partial \phi_1} (\phi_1, \phi_2, \lambda) &= \lambda ( s_{0,1} - \phi_1 s_{1,1} - \phi_2 s_{1,2} ), \\[8pt] \frac{\partial \ell}{\partial \phi_2} (\phi_1, \phi_2, \lambda) &= \lambda ( s_{0,2} - \phi_1 s_{1,2} - \phi_2 s_{2,2} ). \end{aligned} \end{equation}$$


Auto-regression coefficients: The conditional-MLEs for the auto-regressive coefficients solve the simultaneous equations:

$$\begin{equation} \begin{aligned} \hat{\phi}_1 s_{1,1} + \hat{\phi}_2 s_{1,2} &= s_{0,1}, \\[8pt] \hat{\phi}_1 s_{1,2} + \hat{\phi}_2 s_{2,2} &= s_{0,2}. \end{aligned} \end{equation}$$

Solving these equations yields:

$$\hat{\phi}_1 = \frac{s_{0,1} s_{2,2} - s_{0,2} s_{1,2}}{s_{1,1} s_{2,2} - s_{1,2}^2} \quad \quad \quad \hat{\phi}_2 = \frac{s_{0,2} s_{1,1} - s_{0,1} s_{1,2}}{s_{1,1} s_{2,2} - s_{1,2}^2}.$$
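To illustrate (this code is my own sketch, not from the original answer), here is the closed-form solution applied to a simulated AR(2) series, checked against the equivalent least-squares fit on the lagged design matrix. The function name `ar2_coefficient_mle` and the simulation parameters are assumptions for the example:

```python
import numpy as np

def ar2_coefficient_mle(x):
    """Conditional MLE of (phi_1, phi_2) via the closed-form formulas."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # s_{i,j} = sum_{t=3}^T x_{t-i} x_{t-j}, using 0-based slices
    def s(i, j):
        return float(np.dot(x[2 - i:n - i], x[2 - j:n - j]))
    det = s(1, 1) * s(2, 2) - s(1, 2) ** 2
    phi1 = (s(0, 1) * s(2, 2) - s(0, 2) * s(1, 2)) / det
    phi2 = (s(0, 2) * s(1, 1) - s(0, 1) * s(1, 2)) / det
    return phi1, phi2

# Simulate a stationary AR(2) with phi_1 = 0.5, phi_2 = -0.3.
rng = np.random.default_rng(0)
T = 500
x = np.zeros(T)
eps = rng.normal(0.0, 1.0, T)
for t in range(2, T):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + eps[t]

phi1, phi2 = ar2_coefficient_mle(x)

# Cross-check: regress x_t on (x_{t-1}, x_{t-2}) by least squares.
X = np.column_stack([x[1:-1], x[:-2]])
beta, *_ = np.linalg.lstsq(X, x[2:], rcond=None)
```

The two approaches coincide because the simultaneous equations above are exactly the normal equations $X^\top X \hat\beta = X^\top y$ for the no-intercept regression of $x_t$ on its two lags.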


Variance/precision parameter: The conditional-MLE for the variance/precision is obtained by setting the first of the score equations to zero and substituting the estimators for the auto-regressive coefficients. It is given by:

$$\hat{\sigma}^2 = \frac{1}{\hat{\lambda}} = \frac{1}{T-2} \sum_{t=3}^T (x_t - \hat{\phi}_1 x_{t-1} - \hat{\phi}_2 x_{t-2})^2.$$

This is a biased estimator, and a degrees-of-freedom correction is usually applied: since the sum has $T-2$ terms and two coefficients have been estimated, one substitutes $T-4$ in the denominator instead of $T-2$.
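As a final sketch (again my own illustration, with an assumed function name), the variance estimator is just the mean of the squared fitted residuals, with an optional degrees-of-freedom correction that subtracts the two estimated coefficients from the $T-2$ usable observations:

```python
import numpy as np

def ar2_sigma2_mle(x, phi1, phi2, dof_correct=False):
    """Conditional MLE of sigma^2 from the AR(2) residuals.

    With dof_correct=True, divides the residual sum of squares by T - 4
    (the T - 2 usable observations minus the 2 estimated coefficients)
    instead of T - 2.
    """
    x = np.asarray(x, dtype=float)
    resid = x[2:] - phi1 * x[1:-1] - phi2 * x[:-2]
    denom = len(x) - 4 if dof_correct else len(x) - 2
    return float(np.sum(resid ** 2)) / denom
```

In practice one passes in the estimates $\hat\phi_1, \hat\phi_2$ obtained from the closed-form solution above.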