Best Linear Predictor


Suppose $X$ has mean $0$ and finite second moment $\sigma^2_X$, and is observed with additive error: $Y_i = X + Z_i$, where the $Z_i$ are white noise, independent of $X$, with finite second moment $\sigma^2_Z$.

How do I find the best linear predictor of $X$ based on $n$ observations $Y_1, Y_2, \dots, Y_n$, and its mean-squared error?

If my understanding is correct, I need to find the best linear predictor of $X$ as a linear combination of the observations, i.e. $\hat X = a_1 Y_1 + a_2 Y_2 + \dots + a_n Y_n$. (Note the coefficients must multiply the observed $Y_i$, not $Y_i - Z_i$: the noise $Z_i$ is not observed, and $Y_i - Z_i$ is just $X$ itself.)
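One standard route, sketched here under the assumption that the predictor is a linear combination $\hat X = \sum_i a_i Y_i$ of the observed values, is the orthogonality principle: the optimal coefficients make the prediction error uncorrelated with every observation.

```latex
% Orthogonality principle: require E[(X - \hat X) Y_j] = 0 for each j.
\mathbb{E}[X Y_j] = \sum_{i=1}^{n} a_i \,\mathbb{E}[Y_i Y_j]
\quad\Longrightarrow\quad
\sigma_X^2 = \sigma_X^2 \sum_{i=1}^{n} a_i \;+\; a_j \sigma_Z^2 ,
\qquad j = 1, \dots, n,
```

using $\mathbb{E}[X Y_j] = \sigma_X^2$ and $\mathbb{E}[Y_i Y_j] = \sigma_X^2 + \delta_{ij}\sigma_Z^2$. By symmetry all $a_j$ are equal, which reduces this to a single linear equation in one unknown.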

How would I proceed from here? Would the MSE be $\mathbb{E}\big[(X - \hat X)^2\big] = \mathbb{E}\big[\big(X - a_1 Y_1 - a_2 Y_2 - \dots - a_n Y_n\big)^2\big]$?
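As a sanity check on whatever coefficients come out of the algebra, the problem can be simulated. The sketch below assumes Gaussian $X$ and $Z_i$ purely for illustration (only second moments matter for linear prediction), fits the sample-moment version of the normal equations by least squares, and compares the resulting MSE against the candidate closed form $\sigma_X^2 \sigma_Z^2 / (n\sigma_X^2 + \sigma_Z^2)$; the variable names and parameter values are my own, not from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_x, sigma_z, n = 2.0, 1.0, 5   # illustrative values, not from the question
trials = 200_000

# Each trial: one draw of X, observed n times with independent white noise.
x = rng.normal(0.0, sigma_x, size=trials)        # latent X, mean 0
z = rng.normal(0.0, sigma_z, size=(trials, n))   # noise Z_1, ..., Z_n
y = x[:, None] + z                               # observations Y_i = X + Z_i

# Empirical best linear predictor: least squares solves the sample
# version of the normal equations E[Y Y^T] a = E[Y X].
a, *_ = np.linalg.lstsq(y, x, rcond=None)
mse = np.mean((x - y @ a) ** 2)

# By symmetry the fitted coefficients should all be (approximately) equal,
# and the MSE should match sigma_x^2 sigma_z^2 / (n sigma_x^2 + sigma_z^2).
theory = sigma_x**2 * sigma_z**2 / (n * sigma_x**2 + sigma_z**2)
print("coefficients:", a)
print("empirical MSE:", mse, " candidate formula:", theory)
```

With enough trials the fitted coefficients cluster around a common value and the empirical MSE agrees with the closed form, which is a useful check that the analytic derivation was carried out correctly.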