Autoregressive model: $X_t=c+\sum_{i=1}^p \varphi_i X_{t-i} +\varepsilon_t$: when is the constant $c$ relevant?


I'm new to time series, so this might be a naive question (or worse, but hopefully not).

The autoregressive model $\mathrm{AR}(p)$ is defined by the equation $$X_t=c+\sum_{i=1}^p \varphi_i X_{t-i} +\varepsilon_t,$$ where, if I understand correctly, the $\varphi_i$ are parameters estimated from prior data (e.g. via the autocorrelations), $c+\sum_{i=1}^p \varphi_i X_{t-i}$ is the estimate for $X_t$, and $\varepsilon_t$ is the error of this estimate.

But, what is $c$? Why is it there? When is it used?

It's described as a "constant", but this doesn't explain why we need this constant.

Question: When is $c$ relevant? What is an example when omitting $c$ is desirable, and an example when it is not desirable?

(NB. Images here and on Wikipedia (including equations) do not work where I am currently.)


BEST ANSWER

If the mean of the process is non-zero, you need the constant term:

Consider a zero-mean process:

$X_t=\sum_{i=1}^p \varphi_i X_{t-i} +\varepsilon_t$

(you can check that the unconditional expectation is $0$ when the series is stationary: taking expectations on both sides gives $E[X_t]=\sum_{i=1}^p \varphi_i\, E[X_t]$, so $E[X_t]\left(1-\sum_{i=1}^p \varphi_i\right)=0$, and since stationarity rules out $\sum_{i=1}^p \varphi_i=1$, this forces $E[X_t]=0$)
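This is easy to check numerically. Here is a minimal sketch, simulating a zero-mean AR(1) with an illustrative coefficient $\varphi = 0.5$ (not from the text) and confirming the sample mean is close to $0$:

```python
import random

random.seed(0)

# Simulate a zero-mean AR(1): X_t = phi * X_{t-1} + eps_t
# phi = 0.5 is an illustrative choice; |phi| < 1 ensures stationarity.
phi = 0.5
n = 200_000
x = 0.0
total = 0.0
for _ in range(n):
    x = phi * x + random.gauss(0.0, 1.0)
    total += x

sample_mean = total / n
print(sample_mean)  # close to 0 for a stationary zero-mean AR(1)
```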

Now consider another series where $Y_t=X_t+\mu$ (that is, $Y$ is just the zero-mean series $X$ shifted up by $\mu$). Substituting $X_t=Y_t-\mu$ into the equation above gives:

$(Y_t-\mu)=\sum_{i=1}^p \varphi_i (Y_{t-i}-\mu) +\varepsilon_t$

If you rearrange and collect the terms in $\mu$, you get:

$Y_t=c+\sum_{i=1}^p \varphi_i Y_{t-i} +\varepsilon_t$

where $c= \mu(1-\sum_{i=1}^p \varphi_i)$.
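The same kind of simulation verifies this relationship. A sketch, assuming illustrative values $\mu = 10$ and $\varphi = 0.7$: setting $c = \mu(1-\varphi)$ should produce a series whose sample mean is close to $\mu$.

```python
import random

random.seed(1)

# Illustrative parameters: target mean mu and AR(1) coefficient phi.
mu = 10.0
phi = 0.7
c = mu * (1.0 - phi)   # the constant implied by c = mu * (1 - sum(phi_i))

n = 200_000
y = mu                 # start at the mean to avoid a burn-in transient
total = 0.0
for _ in range(n):
    y = c + phi * y + random.gauss(0.0, 1.0)
    total += y

print(total / n)  # close to mu
```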

So in short: the constant lets you use AR models for processes whose mean is non-zero. Omitting $c$ amounts to assuming the mean is exactly zero, which is reasonable when the series has been demeaned or differenced beforehand, but will distort the fit otherwise.
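To see the practical consequence of omitting the constant, here is a sketch (with hypothetical parameters $\mu = 10$, $\varphi = 0.7$, so the true constant is $c = 3$) that fits an AR(1) by ordinary least squares both with and without an intercept. Without the intercept, the estimated coefficient is pushed toward $1$, because the slope alone must also account for the non-zero mean:

```python
import random

random.seed(2)

# Hypothetical AR(1) series with mean mu = 10 and phi = 0.7.
mu, phi = 10.0, 0.7
c = mu * (1.0 - phi)
ys = [mu]
for _ in range(5000):
    ys.append(c + phi * ys[-1] + random.gauss(0.0, 1.0))

x, y = ys[:-1], ys[1:]  # regress y_t on y_{t-1}
n = len(x)

# OLS with an intercept recovers phi and c.
mx, my = sum(x) / n, sum(y) / n
phi_c = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
        sum((a - mx) ** 2 for a in x)
c_hat = my - phi_c * mx

# OLS through the origin (no constant) drags the slope toward 1,
# since it has to absorb the mean level of the series as well.
phi_no_c = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

print(phi_c, c_hat)  # near the true phi = 0.7 and c = 3
print(phi_no_c)      # noticeably closer to 1
```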