Converting Univariate Time Series to a Multivariate Time Series


$X_{t} =0.9X_{t-1}-0.7X_{t-2} + \epsilon_t$

It's clear that the above process does not exhibit the Markov property (i.e. the future depending only on the present), since $X_t$ depends on the two previous values $X_{t-1}$ and $X_{t-2}$.

How would I rewrite the above time series as a multivariate process that does have the Markov property? I am pretty much baffled as to how to carry out such a manipulation.

1 Answer

It sounds more complicated than it is. Ignore the error term for now. First, replace the single-variable current state $X_t$ with an ordered-pair state $Y_t = (X_t, X_{t-1})$. It should be clear that you can derive the new state $Y_{t+1} = (X_{t+1}, X_t)$ by computing

$$ X_{t+1} = 0.9X_t-0.7X_{t-1} $$

where both input values are available in $Y_t$. For instance, if $Y_1 = (10, 8)$, then you compute $Y_2$ by observing that $X_2 = 0.9(10)-0.7(8) = 9-5.6 = 3.4$. You copy $X_1$ from the first element of $Y_1$ to the second element of $Y_2$, yielding $Y_2 = (3.4, 10)$. Then $Y_3 = (-3.94, 3.4)$. And so on.
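The iteration above can be sketched in a few lines of Python (the function and variable names are my own, not part of the original answer):

```python
# Deterministic part of the pair-state update Y_{t+1} = (X_{t+1}, X_t),
# where X_{t+1} = 0.9 X_t - 0.7 X_{t-1}.
def step(y):
    """Advance the state Y_t = (X_t, X_{t-1}) one step."""
    x_t, x_prev = y
    return (0.9 * x_t - 0.7 * x_prev, x_t)

y1 = (10, 8)
y2 = step(y1)  # first component: 0.9*10 - 0.7*8 = 3.4
y3 = step(y2)  # first component: 0.9*3.4 - 0.7*10 = -3.94
```

Note that each step needs only the current pair, which is exactly the Markov property for $Y_t$.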

That is merely the deterministic part. Now, for the error term $\epsilon_t$. The straightforward thing to do is to compute the new state as before, but with the error term added in. Unfortunately, if you just do that, the error terms in the earlier values will be mixed together, and you will be unable to tell how much of $X_t$ came from $X_{t-1}$ and $X_{t-2}$, and how much came from $\epsilon_t$. The way around that is to carry along previous error terms as additional components of the state $Y_t$. How many terms do you need to add in order to figure out how the previous values were computed?
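For concreteness, the full stochastic update can be written in matrix form (this is the standard companion form of the recursion, not spelled out in the original answer):

$$ \begin{pmatrix} X_{t+1} \\ X_t \end{pmatrix} = \begin{pmatrix} 0.9 & -0.7 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} X_t \\ X_{t-1} \end{pmatrix} + \begin{pmatrix} \epsilon_{t+1} \\ 0 \end{pmatrix}, $$

so the noise enters only the first component of the state.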

ETA: I take it back. I guess in this particular instance, you do not have to make separate room for the error terms. In other problems (in particular, where the error terms exhibit some auto-correlation), you may have to, but apparently not here.
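A minimal sketch of the noisy update, assuming i.i.d. Gaussian shocks (the names, the seed, and the choice $\sigma = 1$ are my own illustrative assumptions):

```python
import numpy as np

# Companion matrix of the AR(2) recursion X_{t+1} = 0.9 X_t - 0.7 X_{t-1}.
A = np.array([[0.9, -0.7],
              [1.0,  0.0]])

def noisy_step(y, rng, sigma=1.0):
    """One step of Y_{t+1} = A Y_t + (eps_{t+1}, 0), with Gaussian eps."""
    eps = rng.normal(0.0, sigma)
    return A @ y + np.array([eps, 0.0])

rng = np.random.default_rng(0)
y = np.array([10.0, 8.0])  # Y_1 = (X_1, X_0)
for _ in range(5):
    y = noisy_step(y, rng)
```

Because the shock is added only to the first component while the second component is a pure copy of the previous $X_t$, the pair state remains Markov without storing any past error terms, as the ETA above concludes.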