I am trying to solve the following problem:
Let the time series $X_n, n ∈ \mathbb Z$ be a $WN(0, σ^2)$. Find an optimal (in mean square sense) predictor for $X_{n+1}$ if you can observe: 1) $X_n$, 2) $X_{n−1}$, 3) $X_1$. Find mean square error of $\tilde X_{n+1}$.
I know that the optimal predictor (in the mean square sense) is $E[Y\mid X]$, i.e. the function that minimizes the MSE $E[(Y-\tilde Y)^2]$. But I really don't know where to start.
I am really thankful for any suggestions and ideas.
I think I solved this problem; it should go like this:
$E[X_{n+1}\mid X_n] = E[X_{n+1}] = 0$

If the white noise terms are independent (e.g. Gaussian white noise), the future value $X_{n+1}$ is independent of $X_n$, so the conditional expectation reduces to the unconditional one, which is zero. If they are only assumed uncorrelated, the same constant $0$ is still the best *linear* predictor. So the optimal predictor is $\tilde X_{n+1} = 0$.
$MSE = E[(0-X_{n+1})^2] = E[X_{n+1}^2] = \operatorname{Var}(X_{n+1}) = \sigma^2$
The same argument applies when observing $X_{n-1}$ or $X_1$: each is uncorrelated with $X_{n+1}$, so the predictor is again $0$ with MSE $\sigma^2$.
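As a quick sanity check of this answer, here is a small simulation sketch (my own, not part of the problem): it draws Gaussian white noise with a chosen $\sigma^2$, verifies that consecutive values are essentially uncorrelated, and checks that predicting $X_{n+1}$ by the constant $0$ yields an MSE close to $\sigma^2$.

```python
import numpy as np

# Simulate Gaussian white noise with sigma = 2, i.e. sigma^2 = 4
rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(0.0, sigma, size=100_000)

# Predicting X_{n+1} by the constant 0 gives MSE = E[X_{n+1}^2] = sigma^2
mse_zero = np.mean(x[1:] ** 2)

# Sample correlation between X_n and X_{n+1} should be near 0
corr = np.corrcoef(x[:-1], x[1:])[0, 1]

print(mse_zero)  # should be close to sigma^2 = 4
print(corr)      # should be close to 0
```

The simulation can't prove optimality, but it does confirm that the zero predictor achieves MSE $\approx \sigma^2$ and that past values carry no linear information about the future.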