What is the relation between convexity and positive semi-definite Hessian Matrix?


While studying maximum likelihood estimation for linear regression, I followed all the contents and mathematical steps and understood how each step was derived. At the end, the book says we take the second derivative and check that the resulting Hessian matrix is positive semi-definite, because this guarantees the solution is a minimum. But I don't understand why positive semi-definiteness corresponds to a minimum. The book also mentions convexity, but I am not really getting it. In short, I cannot form a mental picture connecting a positive semi-definite Hessian with convexity.

Hope to hear some intuitive explanations considering I am not a real math person but trying to learn more.

Best answer:

A real-valued twice differentiable function $f:\mathbb{R}^n \rightarrow \mathbb{R}$, defined on a convex set $D \subseteq \mathbb{R}^n$, is convex if and only if its Hessian matrix $Hf(\mathbf{x})$ is positive semi-definite for all $\mathbf{x}$ in $D$. If a function is convex, every local minimum is a global minimum, and the set of its minimizers is convex; if it is strictly convex, the minimizer (when it exists) is unique.
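To make this concrete for the linear-regression setting in your question: the least-squares loss $f(\mathbf{w}) = \tfrac{1}{2}\|X\mathbf{w} - \mathbf{y}\|^2$ has Hessian $X^\top X$, which is always positive semi-definite because $\mathbf{v}^\top X^\top X \mathbf{v} = \|X\mathbf{v}\|^2 \geq 0$ for any $\mathbf{v}$. A minimal numerical sketch (the random data here is purely illustrative):

```python
import numpy as np

# The least-squares loss f(w) = 0.5 * ||X w - y||^2 has Hessian H = X^T X.
# It is positive semi-definite: v^T (X^T X) v = ||X v||^2 >= 0 for every v,
# which is exactly the condition that makes the loss convex.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))  # 20 samples, 3 features (arbitrary example)

H = X.T @ X
eigvals = np.linalg.eigvalsh(H)  # PSD <=> all eigenvalues >= 0

print(eigvals)
print(np.all(eigvals >= -1e-10))  # True (tolerance for floating-point error)
```

Since this holds for every $\mathbf{w}$, the loss surface is a convex "bowl" everywhere, which is the picture connecting the PSD Hessian to convexity.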

It may also be useful to know that, if $\mathbf{x}_0$ is a stationary point of $f$, a sufficient condition for it to be a local minimum is that $Hf(\mathbf{x}_0)$ is positive definite.
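The second-derivative test above can be illustrated with two simple surfaces: $f(x, y) = x^2 + y^2$ (a bowl, whose Hessian at the stationary point $(0, 0)$ is positive definite) versus $g(x, y) = x^2 - y^2$ (a saddle, whose Hessian there is indefinite). A quick sketch, using the fact that a Cholesky factorization exists if and only if a symmetric matrix is positive definite:

```python
import numpy as np

# f(x, y) = x^2 + y^2: stationary point at (0, 0), Hessian = diag(2, 2).
H_min = np.array([[2.0, 0.0],
                  [0.0, 2.0]])

# g(x, y) = x^2 - y^2: also stationary at (0, 0), but Hessian = diag(2, -2)
# is indefinite, so (0, 0) is a saddle point rather than a minimum.
H_saddle = np.array([[2.0,  0.0],
                     [0.0, -2.0]])

def is_positive_definite(H):
    """A symmetric matrix is positive definite iff Cholesky succeeds."""
    try:
        np.linalg.cholesky(H)
        return True
    except np.linalg.LinAlgError:
        return False

print(is_positive_definite(H_min))     # True  -> (0, 0) is a local minimum
print(is_positive_definite(H_saddle))  # False -> the test does not certify a minimum
```

Note that positive definiteness at a stationary point is sufficient but not necessary: $f(x) = x^4$ has a minimum at $0$ even though its second derivative vanishes there.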