In my textbook there is this exercise in linear least squares chapter:
Let $A$ be a given $n\times n$ matrix and $v$ a specified $n$-vector. Find the value $\lambda^*$ that minimizes $\|Av-\lambda v\|_2^2$ with respect to $\lambda$. If the optimal residual happens to be zero, what does this imply about $v$ and $\lambda$?
For the second part of the exercise, I think $v$ and $\lambda$ need to be an eigenvector and eigenvalue of $A$, so that the residual can be zero.
For the first part, I tried decomposing $v = v_R + v_N$ into its component in the range of $A$ and its component in the orthogonal complement:
$\rho=Av-\lambda v= \rho_R + \rho_N = Av-\lambda v_R-\lambda v_N$
$\rho_R=Av-\lambda v_R, \quad \rho_N=-\lambda v_N$
$\|\rho\|_2^2=\|\rho_R\|_2^2+\|\rho_N\|_2^2=\|Av-\lambda v_R\|_2^2 + \|-\lambda v_N\|_2^2= \|Av-\lambda v_R\|_2^2 + \lambda^2 \|v_N\|_2^2$
I don't know how to continue.
If $v$ is an eigenvector of $A$ and $\lambda$ is its corresponding eigenvalue, then $Av - \lambda v = \lambda v - \lambda v = 0$. Conversely, if $v$ is not an eigenvector, then $(A - \lambda I) v \neq 0$ for every choice of $\lambda$, so the optimal residual is not zero.
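A quick numerical sanity check of this claim, assuming NumPy (the matrix and eigenpair below are just an illustrative example; I take $A$ symmetric so the eigenpairs are real):

```python
import numpy as np

# Build a random symmetric matrix so np.linalg.eigh gives real eigenpairs.
rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
A = A + A.T

# Take one exact eigenpair (lam, v) of A.
eigvals, eigvecs = np.linalg.eigh(A)
lam = eigvals[0]
v = eigvecs[:, 0]

# The residual A v - lam v vanishes (up to floating-point roundoff).
print(np.linalg.norm(A @ v - lam * v))
```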
As for minimizing the desired quantity with respect to $ \lambda$ (assuming our vector space is real; the complex case is similar), note that
$$\lVert A v - \lambda v \rVert^2 = \langle A v - \lambda v , A v - \lambda v \rangle = \langle Av, Av \rangle - 2 \lambda \langle v, Av \rangle + \lambda^2 \langle v, v \rangle$$
Since
$$\langle Av, Av \rangle - 2 \lambda \langle v, Av \rangle + \lambda^2 \langle v, v \rangle$$
is quadratic with respect to $\lambda$ and it opens upward ($\langle v,v \rangle > 0$ for $v \neq 0$), we can minimize it by finding its vertex (alternatively, take its derivative with respect to $\lambda$ and set it to zero; this is indeed the minimizer by convexity/first derivative test/second derivative test, take your pick). Using the derivative approach, the minimizer $\lambda^*$ satisfies $-2 \langle v,Av \rangle + 2 \lambda^* \langle v,v \rangle = 0$, which gives $\lambda^* = \dfrac{\langle v, Av \rangle}{\langle v, v \rangle}$, the Rayleigh quotient of $v$.
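A short numerical check of the closed form, again assuming NumPy (random $A$ and $v$ here are illustrative, not from the textbook):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
v = rng.standard_normal(n)

# Closed-form minimizer: the Rayleigh quotient <v, Av> / <v, v>.
lam_star = (v @ A @ v) / (v @ v)

def residual(lam):
    """Objective ||A v - lam v||_2^2 as a function of lam."""
    return np.linalg.norm(A @ v - lam * v) ** 2

# lam_star should do at least as well as any perturbed value.
for d in (-1.0, -0.1, 0.1, 1.0):
    assert residual(lam_star) <= residual(lam_star + d)
print(lam_star)
```

Since the objective is a strictly convex parabola in $\lambda$, beating a few perturbations on both sides is consistent with $\lambda^*$ being the global minimizer.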