Relevant Information. Let $X(t)$, $t \in T$, be a second-order process. Let $M_0$ be the set of random variables of the form $a + b_1X(s_1)+ \cdots + b_nX(s_n)$ for a positive integer $n$ and constants $a, b_1,\ldots, b_n$. Let $M$ be the set of random variables that are mean square limits of elements of $M_0$. Then $M$ is closed under addition, scalar multiplication, and mean square convergence. Please comment if you need more clarification regarding the set $M$.
Problem. Let $\hat{Y} \in M$ satisfy $\mathbf{E}(\hat{Y}-Y)=0$ and $\mathbf{E}[(\hat{Y}-Y)X(t)]=0$ for all $t \in T$. Show that $\hat{Y}$ is an optimal linear estimator of $Y$.
To start, I think that the following theorem would be relevant:
Theorem: $\hat{Y} \in M$ is an optimal linear estimator of $Y$ if and only if $\mathbf{E}[(\hat{Y}-Y)Z]=0$ for all $Z\in M$.
Could someone give me an idea of how to start this problem? I noticed that the given assumptions imply $\mathbf{E}[(\hat{Y}-Y)Z]=0$ for all $Z \in M_0$, by taking $Z$ of the form $a + b_1X(s_1)+ \cdots + b_nX(s_n)$. What I'm not sure about is how to extend this from $Z \in M_0$ to all $Z \in M$.
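For concreteness, here is the computation I had in mind for the $M_0$ case (just linearity of expectation plus the two given conditions; I'd appreciate a check that this is right):

```latex
\begin{aligned}
\mathbf{E}[(\hat{Y}-Y)Z]
&= \mathbf{E}\Bigl[(\hat{Y}-Y)\Bigl(a + \sum_{i=1}^{n} b_i X(s_i)\Bigr)\Bigr] \\
&= a\,\mathbf{E}(\hat{Y}-Y) + \sum_{i=1}^{n} b_i\,\mathbf{E}[(\hat{Y}-Y)X(s_i)] \\
&= a \cdot 0 + \sum_{i=1}^{n} b_i \cdot 0 = 0.
\end{aligned}
```

So the only remaining gap is passing this identity through a mean square limit.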