Proof about hessian matrix and vectors


I would like some help with the following question:

Let $ f: \mathbb{R}^n \rightarrow \mathbb{R} $ be a $ C^{2} $ function, let $a \in \mathbb{R}^n $ be a local minimum of $f$, and let $v\in \mathbb{R}^n$ be a vector.

Show that $$v^T \cdot H_{f}(a) \cdot v \ge 0 $$

Also show that all the eigenvalues of $H_{f}(a)$ are non-negative, where $H_f$ is the Hessian of $f$.

I was given a hint that I should use the Taylor expansion of $g(t)=f(a+tv)$.

Any insight about this question will be very helpful.

Two answers are given below.

Accepted answer:

Given the $C^2$ condition you are entitled to expand $f$ around the local minimum (say $x_0$) as $$f(x_0+h)-f(x_0)=\nabla f|_{x_0}\cdot h+\frac12h^TH_fh+o(\|h\|^2)$$ at least for small $h\in\Bbb R^n$. Since $x_0$ is a local minimum and $f$ is differentiable there, the first-order necessary condition gives $\nabla f|_{x_0}=0$, so the RHS reduces to $\frac12h^TH_fh+o(\|h\|^2)$.

Since $H_f$ is symmetric (mixed partials of a $C^2$ function agree), we can find an orthogonal $Q$ such that $H_f=Q^T\Lambda Q$, where $\Lambda$ is the diagonal matrix of eigenvalues of $H_f$. Letting $\eta:=Qh$ (so $\|\eta\|=\|h\|$, as $Q$ is orthogonal), $$f(x_0+h)-f(x_0)=\frac12h^TH_fh+o(\|h\|^2)=\frac12\eta^T\Lambda\eta + o(\|\eta\|^2)=\frac12\sum_{i=1}^n\lambda_i\eta_i^2 + o(\|\eta\|^2).$$ Now suppose some eigenvalue is negative, say $\lambda_1<0$. Choose $\eta=te_1$ (i.e. $h=tQ^Te_1$), so that $$f(x_0+h)-f(x_0)=\frac12\lambda_1t^2+o(t^2)<0$$ for $t$ sufficiently small, contradicting that $x_0$ is a local minimum.
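Not part of the proof, but here is a quick numerical sanity check of the claim, sketched with numpy. The function `f` and the finite-difference Hessian below are illustrative choices of mine, not from the question:

```python
import numpy as np

# Example C^2 function with a local minimum at a = (0, 0).
# Its exact Hessian there is [[2, 1], [1, 6]], which is positive definite.
def f(p):
    x, y = p
    return x**2 + 3 * y**2 + x * y

def numerical_hessian(f, a, eps=1e-4):
    """Central-difference approximation of the Hessian of f at a."""
    n = len(a)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (f(a + ei + ej) - f(a + ei - ej)
                       - f(a - ei + ej) + f(a - ei - ej)) / (4 * eps**2)
    return H

a = np.zeros(2)
H = numerical_hessian(f, a)

# Eigenvalues of the (symmetric) Hessian at the minimum: all non-negative.
print(np.linalg.eigvalsh(H))

# v^T H v >= 0 for random directions v.
rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.standard_normal(2)
    assert v @ H @ v >= 0
```

Of course this only checks one example; the proof above is what establishes the statement in general.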

Second answer:

Apply the Taylor expansion to $g(t)=f(a+tv)$. The gradient term vanishes by the first-order necessary condition at the local minimum. Then use $g(t) \geq g(0)$ for all sufficiently small $t$ (for any unit vector $v$) to conclude $g''(0) = v^T H_f(a) v \geq 0$.
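Spelled out, this sketch amounts to the following. Since $f$ is $C^2$, the one-variable function $g(t)=f(a+tv)$ satisfies $$g(t)=g(0)+g'(0)\,t+\frac12 g''(0)\,t^2+o(t^2), \qquad g'(0)=\nabla f(a)\cdot v, \quad g''(0)=v^T H_f(a)\, v.$$ Because $a$ is a local minimum, $g'(0)=\nabla f(a)\cdot v=0$ and $g(t)\ge g(0)$ for all sufficiently small $t$, so $$0 \le \frac{g(t)-g(0)}{t^2} = \frac12 v^T H_f(a)\, v + o(1) \quad (t\to 0),$$ and letting $t\to 0$ gives $v^T H_f(a)\, v \ge 0$.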