Assume the usual linear regression model $Y = X \beta + \epsilon$, where $X$ is fixed and known, $E(\epsilon) = 0$, and $\operatorname{var}(\epsilon) = \sigma^2 I$.
Let $a \neq 0$ be a fixed $n$-dimensional vector, let $b$ be a fixed $m$-dimensional vector, and define
$$v = a^T\epsilon + b^T \beta$$
We would like to find a linear estimator $\hat{v} = LY$ of $v$ such that $E(\hat{v}) = b^T\beta$ and the mean squared error $E\big((\hat{v} - v)^2\big)$ is as small as possible.
First, I have proved that the estimator
$$LY = \big(a^T(I-H) + b^T(X^TX)^{-1}X^T\big)Y,$$
where $H = X(X^TX)^{-1}X^T$ is the hat matrix, satisfies $E(LY) = b^T\beta$.
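As a sanity check of the unbiasedness claim: $E(LY) = LX\beta$, and $(I-H)X = 0$ while $(X^TX)^{-1}X^TX = I$, so $LX = b^T$ and hence $E(LY) = b^T\beta$. A minimal numerical sketch (the dimensions $n = 20$, $m = 3$ and the random $X$, $a$, $b$ are arbitrary choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 3                                  # arbitrary dimensions for illustration

X = rng.standard_normal((n, m))               # fixed, known design matrix
a = rng.standard_normal(n)                    # a != 0
b = rng.standard_normal(m)                    # fixed m-dimensional vector

XtX_inv = np.linalg.inv(X.T @ X)
H = X @ XtX_inv @ X.T                         # hat matrix H = X(X^T X)^{-1} X^T
L = a @ (np.eye(n) - H) + b @ XtX_inv @ X.T   # the row vector L as a 1-D array

# E(LY) = L X beta, so unbiasedness E(LY) = b^T beta is equivalent to L X = b^T:
print(np.allclose(L @ X, b))                  # True
```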
How can I show that any linear estimator $LY$ with these properties satisfies
$$E\big((LY - v)^2\big) = E\big((LY - E(v\mid Y))^2\big) + E\big((E(v\mid Y) - v)^2\big),$$
so that the mean squared error is as small as possible?
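My attempt so far: expanding the square, the decomposition holds if and only if the cross term vanishes, which I believe follows from the tower property of conditional expectation, but I am not sure this completes the argument:

```latex
\begin{align*}
E\big[(LY - E(v\mid Y))(E(v\mid Y) - v)\big]
  &= E\Big[\, E\big[(LY - E(v\mid Y))(E(v\mid Y) - v) \,\big|\, Y\big] \Big] \\
  &= E\Big[ (LY - E(v\mid Y))\, E\big[E(v\mid Y) - v \,\big|\, Y\big] \Big] \\
  &= E\big[ (LY - E(v\mid Y)) \cdot 0 \big] = 0.
\end{align*}
```

(The second equality pulls out $LY - E(v\mid Y)$, which is a function of $Y$.)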
Thanks for any help.