I have the following autoregressive process $$X_t = 5X_{t-1} - 6X_{t-2} + Z_t \tag{1}$$ where $(Z_t)_t$ is a sequence of independent standard normal random variables.
Can I then find a stationary sequence $(\hat{X}_t)_t$ that satisfies $(1)$ and $\hat{X}_0 = 0$ almost surely?
I know that the unique stationary solution to $(1)$ is $X_t = \sum_{j\in\mathbb{Z}}\psi_jZ_{t-j}$, where $(\psi_j)_j$ are the coefficients of the Laurent series obtained by inverting the polynomial defining the AR process in $(1)$. By this I mean $\sum_j\psi_jz^j = \frac{1}{1-5z+6z^2}$.
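For concreteness, the $\psi_j$ can be computed by partial fractions (this is a side computation under the factorization below): since $1-5z+6z^2 = (1-2z)(1-3z)$ has roots $\tfrac12$ and $\tfrac13$, both inside the unit circle, the expansion valid on an annulus containing $\lvert z\rvert = 1$ is in negative powers of $z$: $$\frac{1}{1-5z+6z^2} = \frac{-2}{1-2z} + \frac{3}{1-3z} = \sum_{k\geq 1}\left(2^{1-k}-3^{1-k}\right)z^{-k}, \qquad \lvert z\rvert > \tfrac12.$$ That is, $\psi_{-k} = 2^{1-k}-3^{1-k}$ for $k \geq 2$ and $\psi_j = 0$ otherwise (note $\psi_{-1} = 2^0 - 3^0 = 0$), so the stationary solution is purely non-causal: $X_t = \sum_{k\geq 2}\left(2^{1-k}-3^{1-k}\right)Z_{t+k}$.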
The question then boils down to whether $\sum_{j\in\mathbb{Z}}\psi_jZ_{-j} = 0$ almost surely. I first consider the partial sums $$S_n = \sum_{\lvert j\rvert \leq n}\psi_jZ_{-j}, \qquad S = \sum_{j\in\mathbb{Z}}\psi_jZ_{-j}.$$ Since $(Z_{j})$ is an independent sequence of standard normals, each $S_n$ is normal. The infinite sum is well-defined because the $\psi_j$ are absolutely summable, so $S_n \to S$ almost surely, hence also in distribution. I know that if a sequence of normal random variables converges in distribution, then the limit is either normal or degenerate (a.s. constant). So I cannot immediately rule out the possibility that $S = 0$ almost surely. How do I proceed here?
I just realized that $S$ can only be degenerate if $\operatorname{Var}(S_n) \to 0$. But $\operatorname{Var}(S_n) = \sum_{\lvert j\rvert \leq n}\lvert \psi_j\rvert^2 \to \sum_{j}\lvert \psi_j\rvert^2$, and this limit is zero if and only if $\psi_j = 0$ for every $j$, which is not the case here. So $S$ is a non-degenerate normal random variable, and in particular $S = 0$ almost surely is impossible. Is this correct?
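As a numerical sanity check (a sketch, not part of the argument; it assumes the closed form $\psi_{-k} = 2^{1-k}-3^{1-k}$ for $k \geq 2$, zero otherwise, obtained by partial fractions of $\frac{1}{(1-2z)(1-3z)}$), one can verify both the defining relation for the $\psi_j$ and that $\sum_j \psi_j^2 > 0$:

```python
# Laurent coefficients of 1/(1 - 5z + 6z^2) on an annulus containing |z| = 1:
# assumed closed form psi_{-k} = 2^{1-k} - 3^{1-k} for k >= 2, zero otherwise.
K = 60  # truncation level; the coefficients decay geometrically
psi = {-k: 2.0 ** (1 - k) - 3.0 ** (1 - k) for k in range(2, K)}

def coef(j):
    """Coefficient psi_j, with 0 for indices outside the dictionary."""
    return psi.get(j, 0.0)

# Check (1 - 5z + 6z^2) * sum_j psi_j z^j = 1, i.e. the coefficient
# psi_j - 5 psi_{j-1} + 6 psi_{j-2} equals 1 at j = 0 and 0 elsewhere.
for j in range(-K + 4, 3):
    lhs = coef(j) - 5 * coef(j - 1) + 6 * coef(j - 2)
    target = 1.0 if j == 0 else 0.0
    assert abs(lhs - target) < 1e-12, (j, lhs)

# Var(S) = sum_j psi_j^2 is strictly positive, so S is non-degenerate normal.
var_S = sum(c ** 2 for c in psi.values())
print(var_S)  # strictly positive (approx. 0.058)
```

The truncation at $K = 60$ is harmless here because the coefficients decay like $2^{-k}$.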
The OP clarified that the index can take on negative values, so $Y_0$ is an intermediate element of the process.
If $Y_0 = 0$ then, to satisfy also
$$Y_t = 5Y_{t-1} - 6Y_{t-2} + Z_t \tag{1} $$
we must have
$$Y_0 = 5Y_{-1} - 6Y_{-2} + Z_0= 0 \implies Y_{-1} = \frac 6 5Y_{-2}-\frac 1 5Z_0$$
$$Y_1 = 5Y_0 - 6Y_{-1} + Z_1 = -6Y_{-1}+Z_1$$ $$Y_2 = 5Y_1 - 6Y_0 + Z_2 = 5Y_1 +Z_2$$ (using $Y_0 = 0$).
Combining
$$Y_2 = 5\Big(-6\big[\frac 6 5Y_{-2}-\frac 1 5Z_0\big]+Z_1\Big) +Z_2$$
$$\implies Y_2^* = -36Y_{-2} +6Z_0 + 5Z_1+Z_2 \tag{2}$$
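As a sanity check on the algebra above (a numerical sketch; the variable names are mine), one can draw arbitrary values for $Y_{-2}$ and the noise terms, build the sequence from the displayed formulas, and verify consistency with $(1)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw arbitrary values for the "free" quantities; the identities below
# hold for any draw, since they are algebraic consequences of (1).
Ym2, Z0, Z1, Z2 = rng.standard_normal(4)   # Y_{-2}, Z_0, Z_1, Z_2

Ym1 = (6 / 5) * Ym2 - (1 / 5) * Z0          # choice of Y_{-1} that forces Y_0 = 0
Y0 = 5 * Ym1 - 6 * Ym2 + Z0                 # recursion (1) at t = 0
Y1 = -6 * Ym1 + Z1                          # recursion (1) at t = 1, using Y_0 = 0
Y2 = 5 * Y1 + Z2                            # recursion (1) at t = 2, using Y_0 = 0
Y2_star = -36 * Ym2 + 6 * Z0 + 5 * Z1 + Z2  # closed form (2)

assert abs(Y0) < 1e-9            # Y_0 = 0 as required
assert abs(Y2 - Y2_star) < 1e-9  # (2) agrees with the step-by-step recursion
print("algebra checks out")
```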
Note that by construction $(2)$ satisfies the general relation $(1)$. And the question becomes: Is the process (that satisfies $(1)$)
$$\{...,Y_{-2},Y_{-1}, 0, Y_1, Y_2^*,Y_3,...\}$$
(with $Y_2^*$ determined in $(2)$)
stationary? Well, it is certainly mean-stationary (with mean equal to zero).
Is it also covariance-stationary? In hindsight, the answer should have been obvious from the beginning: if an intermediate element of the process is almost surely constant (zero or non-zero), then that element has zero variance, while the other elements have strictly positive variance. So the process is not covariance-stationary, since the autocovariance function of a stationary process requires a common variance throughout.