I need to solve the optimization problem described below:
$$ \min f(x) = \frac{1}{2}\left\lVert x - x_{b} \right\rVert^{2}+ \frac{1}{2}\left\lVert \epsilon \right\rVert^{2}$$ with $$ Hx -y = \epsilon $$
If I want to prove that the problem is convex, how do I take the gradient of this function twice? I have trouble differentiating when there is a norm involved. Thanks for the help.
You can express the squared norms as inner products of vectors, leading to the quadratic form highlighted in blue below (I have assumed all vectors have real components, so that for vectors $x$ and $y$ we have $x^Ty=y^Tx$): $$\begin{align}f(x) &= \frac{1}{2}\left\lVert x - x_{b} \right\rVert^{2}+ \frac{1}{2}\left\lVert \epsilon \right\rVert^{2} \\&= \frac{1}{2}( x - x_{b})^T(x-x_b)+ \frac{1}{2}(Hx-y)^T(Hx-y)\\&=\frac{1}{2}(x^Tx\color{red}{-x^Tx_b-x_b^Tx}+x_b^Tx_b)+\frac{1}{2}(x^TH^THx\color{red}{-x^TH^Ty-y^THx}+y^Ty)\\&=\frac{1}{2}(\color{blue}{x^Tx}\color{red}{-2x_b^Tx}+x_b^Tx_b)+\frac{1}{2}(\color{blue}{x^TH^THx}\color{red}{-2y^THx}+y^Ty) \\&=\frac{1}{2}\color{blue}{x^T(I+H^TH)x}-x_b^Tx+\frac{1}{2}x_b^Tx_b-y^TH^Tx+\frac{1}{2}y^Ty\end{align}$$ Differentiating once gives the gradient $$f^{'}(x)=(I+H^TH)x-x_b-H^Ty.$$ Using the matrix calculus result $\frac{d^2}{dx^2}x^TAx=A+A^T$, the second derivative (Hessian) is then straightforward: $$f^{''}(x)=I+H^TH,$$ where $I$ is the identity matrix. Since $H^TH$ is positive semidefinite, $I+H^TH$ is positive definite (all its eigenvalues are at least $1$), so $f$ is strictly convex.
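If you want a sanity check, the Hessian can be verified numerically. A minimal sketch with NumPy, using arbitrary random dimensions and data (the names `H`, `x_b`, `y` follow the problem statement; the finite-difference setup is my own):

```python
import numpy as np

# Verify that the Hessian of f(x) = 0.5*||x - x_b||^2 + 0.5*||H x - y||^2
# equals I + H^T H, and that it is positive definite.
rng = np.random.default_rng(0)
m, n = 5, 3                      # arbitrary problem size for the check
H = rng.normal(size=(m, n))
x_b = rng.normal(size=n)
y = rng.normal(size=m)

def f(x):
    return 0.5 * np.sum((x - x_b) ** 2) + 0.5 * np.sum((H @ x - y) ** 2)

# Central finite-difference approximation of the Hessian at a random point
# (exact up to rounding here, because f is quadratic).
eps = 1e-4
x0 = rng.normal(size=n)
hess = np.empty((n, n))
for i in range(n):
    for j in range(n):
        e_i = eps * np.eye(n)[i]
        e_j = eps * np.eye(n)[j]
        hess[i, j] = (f(x0 + e_i + e_j) - f(x0 + e_i - e_j)
                      - f(x0 - e_i + e_j) + f(x0 - e_i - e_j)) / (4 * eps**2)

analytic = np.eye(n) + H.T @ H
print(np.allclose(hess, analytic, atol=1e-5))        # Hessian matches I + H^T H
print(np.linalg.eigvalsh(analytic).min() >= 1 - 1e-9)  # eigenvalues >= 1: positive definite
```

The eigenvalue check confirms the convexity argument: every eigenvalue of $I+H^TH$ is at least $1$, regardless of $H$.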