Let $(U,V)$ be jointly distributed random variables (assume zero means for simplicity). It is well known that \begin{align} \min_{a,b} E[(V -(aU+b))^2] \end{align} is attained at $a= \frac{E[VU]}{E[U^2]}$ and $b=0$. The resulting $aU+b$ is what is known as the best linear estimator.
The classical proof expands the expression and sets the partial derivatives with respect to $a$ and $b$ to zero.
My question: Are there alternative proofs that do not use differentiation? For example, can the minimizer be found via known inequalities such as Jensen's or Cauchy-Schwarz? In other words, I am looking for interesting and unusual solutions to this old problem.
Edit: I didn't ask this originally, but can it also be done with minimal expansion?
Expanding, and using $\mathbb{E}U = \mathbb{E}V = 0$, \begin{align*} \mathbb{E}(V-(aU+b))^2 &= \mathbb{E}V^2 - 2 (a \mathbb{E} UV + b \mathbb{E} V) + \mathbb{E}(aU+b)^2 \\ &= (a^2 \mathbb{E}U^2 - 2a \mathbb{E} UV) + b^2 + \mathbb{E}V^2. \end{align*} This is a sum of separable quadratics in $a$ and $b$ (the term $b\,\mathbb{E}V$ and the cross term $2ab\,\mathbb{E}U$ vanish because of the zero means), and each quadratic is minimized at its vertex, $x = -\gamma_1/(2\gamma_2)$ for $\gamma_2 x^2 + \gamma_1 x + \gamma_0$ with $\gamma_2 > 0$. Hence, \begin{align*} a &= \frac{2 \mathbb{E} UV}{2 \mathbb{E}U^2} = \frac{\mathbb{E} UV}{\mathbb{E}U^2}, \\ b &= -\frac{0}{2\cdot 1} = 0. \end{align*}
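If one wants to avoid even the vertex formula, the same minimizers drop out of a single completing-the-square step (a sketch, assuming $\mathbb{E}U^2 > 0$): \begin{align*} a^2\,\mathbb{E}U^2 - 2a\,\mathbb{E}UV + b^2 + \mathbb{E}V^2 &= \mathbb{E}U^2\left(a - \frac{\mathbb{E}UV}{\mathbb{E}U^2}\right)^2 + b^2 + \mathbb{E}V^2 - \frac{(\mathbb{E}UV)^2}{\mathbb{E}U^2}. \end{align*} Both squared terms are nonnegative and can be made zero simultaneously, so the minimum is attained at $a = \mathbb{E}UV/\mathbb{E}U^2$ and $b = 0$, with minimum value $\mathbb{E}V^2 - (\mathbb{E}UV)^2/\mathbb{E}U^2$.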