Connection between positive semidefinite matrices and eigenvalues


During my math class, my professor described the following situation in the context of gradient descent methods:

Define $S = \{ x \in \mathbb{R}^n : f(x) \leq f(x_0)\}$ and assume $f$ is strongly convex on $S$, i.e. there is a constant $m > 0$ with $\nabla^2 f(x) \succcurlyeq mI$ for all $x \in S$. He said something along the lines of "this means the smallest eigenvalue is at least $m$". Can somebody explain why this is the case?

The curly symbol means that the inequality holds in the positive semidefinite sense, but I don't see how that translates to eigenvalues. I don't see how $v^T (\nabla^2 f(x) - mI )v \geq 0$ for all $v \in \mathbb{R}^n$ means that the eigenvalues are at least $m$. Any help would be appreciated. Thanks!

BEST ANSWER

I think the precise statement should be that the eigenvalues of the matrix are at least $m$ if and only if your inequality holds. To frame it more generally: if $A$ is an $n\times n$ real symmetric matrix whose smallest eigenvalue is at least $m$, then $v'(A-mI)v\geq 0$ for all $v\in\mathbb R^n$.

By the spectral theorem, symmetry of $A$ implies it is orthogonally diagonalizable, and hence has a spectral decomposition $A=Q'\Lambda Q$, where $Q'Q=I$, $\Lambda=\text{diag}(\lambda_1,...,\lambda_n)$, and the $\lambda_j$ are the eigenvalues of $A$. Without loss of generality, assume $\lambda_1\leq ...\leq \lambda_n$. Then for all $v\in\mathbb R^n$,
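As a quick numerical sanity check of the spectral decomposition (a sketch using NumPy with a made-up symmetric matrix; the matrix `A` here is just an example, not anything from the problem):

```python
import numpy as np

# Hypothetical 2x2 symmetric matrix for illustration only
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eigh returns eigenvalues in ascending order and
# orthonormal eigenvectors as the columns of V, so A = V @ diag(w) @ V.T
w, V = np.linalg.eigh(A)

# In the answer's notation A = Q' Λ Q with Q'Q = I, so Q = V.T
Q = V.T
Lam = np.diag(w)
assert np.allclose(Q.T @ Lam @ Q, A)   # A = Q' Λ Q
assert np.allclose(Q @ Q.T, np.eye(2)) # Q is orthogonal
assert w[0] <= w[1]                    # λ_1 ≤ λ_2, as assumed
```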

$$ v'Av=(Qv)'\Lambda Qv=\sum_{j=1}^n \lambda_j [(Qv)_j]^2\geq \lambda_1\sum_{j=1}^n [(Qv)_j]^2=\lambda_1 (Qv)'Qv=\lambda_1 v'v\geq mv'v.$$

Conversely, if $v'(A-mI)v\geq 0$ for all $v$, take $v$ to be a unit eigenvector of $A$ with eigenvalue $\lambda$: then $v'(A-mI)v=\lambda-m\geq 0$, so every eigenvalue is at least $m$. This is exactly why your professor's condition $\nabla^2 f(x)\succcurlyeq mI$ forces the smallest eigenvalue of the Hessian to be at least $m$.
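The inequality chain can also be checked numerically (a sketch, again using an arbitrary made-up symmetric matrix and random test vectors):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical symmetric matrix; eigh sorts eigenvalues ascending,
# so w[0] is the smallest eigenvalue λ_1
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
m = np.linalg.eigh(A)[0][0]

# With m = λ_1, the form v'(A - mI)v should be nonnegative
# for every vector v (up to floating-point tolerance)
for _ in range(1000):
    v = rng.standard_normal(2)
    assert v @ (A - m * np.eye(2)) @ v >= -1e-12
```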