Matrix norm inequality $\| Bx\| \geq |\lambda| \| x \|$ for a real symmetric $B$

For a symmetric invertible matrix $B \in \mathbb{R}^{n \times n}$ with eigenvalues $\lambda_1, \dots, \lambda_n \in \mathbb{R}$, it holds for all $x \in \mathbb{R}^{n}$ that $$\|Bx\| \geq \lambda_{s} \, \|x\|,$$ where $\lambda _{s} = \min_{\lambda \in \left\{ \lambda_1, \dots, \lambda_n\right\} } |\lambda|$ denotes the smallest eigenvalue in absolute value. (Note that since $B$ is invertible, $\lambda_s > 0$.)
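As a quick numerical sanity check (not a proof), the claim can be tested with NumPy; the random symmetric matrix `B` and vector `x` below are illustrative examples, not part of the original question:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
B = (A + A.T) / 2                     # symmetrize to get a symmetric B

eigvals = np.linalg.eigvalsh(B)       # real eigenvalues of a symmetric matrix
lam_s = np.min(np.abs(eigvals))       # smallest eigenvalue in absolute value

x = rng.standard_normal(5)
lhs = np.linalg.norm(B @ x)
rhs = lam_s * np.linalg.norm(x)
assert lhs >= rhs - 1e-12             # ||Bx|| >= lambda_s ||x||
```

A random Gaussian symmetric matrix is invertible with probability one, so $\lambda_s > 0$ here.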

Since $B$ is symmetric, the spectral theorem applies: there exists an orthonormal basis of $\mathbb{R}^n$ formed by eigenvectors $v_{1}, \dots, v_{n}$ of $B$ (this basis is not unique in general). The spectral decomposition of $B$ is $$B = \sum_{i=1}^{n} \lambda _{i} v_{i}v_{i}^\intercal.$$ Each outer product $v_{i}v_{i}^\intercal$ is the orthogonal projection onto the one-dimensional subspace spanned by $v_{i}$.
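The decomposition can also be verified numerically; this is a sketch using a random symmetric matrix of my own choosing, with `np.linalg.eigh` returning the orthonormal eigenvectors as columns:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
B = (A + A.T) / 2

# eigh: eigenvalues (ascending) and an orthonormal matrix of eigenvectors
lams, V = np.linalg.eigh(B)

# Rebuild B from the rank-one terms  lambda_i * v_i v_i^T
B_rebuilt = sum(lams[i] * np.outer(V[:, i], V[:, i]) for i in range(4))
assert np.allclose(B, B_rebuilt)

# Each v_i v_i^T is an orthogonal projection: P^2 = P and P^T = P
P0 = np.outer(V[:, 0], V[:, 0])
assert np.allclose(P0 @ P0, P0)
assert np.allclose(P0, P0.T)
```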

Now, I know there's a proof:

$$\|Bx\|^{2} = \sum_{i=1}^{n} \lambda _{i}^{2} ( v_{i}^\intercal x )^{2} \geq \min_{j\in\left\{ 1,..,n \right\}} \lambda _{j}^{2}\sum_{i=1}^{n} (v_{i}^\intercal x) ^{2} = \min_{j \in \left\{ 1,..,n \right\} } \lambda _{j}^{2} \|x\|^{2}$$

But I'm lost at two points:

  1. Why does $$\|Bx\|^{2} = \sum_{i=1}^{n} \lambda _{i}^{2} ( v_{i}^\intercal x )^{2}$$ hold? When I substitute $B$ I get $$\left\lVert \left( \sum_{i=1}^{n} \lambda _{i} v_{i} v_{i}^\intercal \right) x \right\rVert ^{2} = \dots?$$ I tried to write it out, but it gets ugly and doesn't lead to the stated equality. Maybe I'm missing some identity which would make it simple.

  2. Why does $$ \sum_{i=1}^{n} ( v_{i}^\intercal x ) ^{2} = \|x\|^{2} $$ hold?

I also looked at the linked question (Matrix norm inequality : $\| Ax\| \leq |\lambda| \|x\|$, proof verification), but I can't see why (s)he obtained $x^{*} A^{*} A x=x^{*} U^{*} \Lambda^{*} \Lambda U x$. In my (real) case I write the decomposition as $B= Q \Lambda Q^{-1}$, which gives $x^\intercal B^\intercal B x = x^\intercal Q \Lambda ^\intercal \Lambda Q^\intercal x $, not $x^\intercal Q^\intercal \Lambda ^\intercal \Lambda Q x$. The latter would follow from $B = Q^{-1} \Lambda Q $, not $B = Q \Lambda Q^{-1} $, and in general $Q \Lambda Q^{-1} \neq Q^{-1} \Lambda Q$. (Though for symmetric $B$ the matrix $Q$ can be taken orthogonal, so $Q^{-1} = Q^\intercal$, and the two conventions differ only in whether the eigenvectors are the columns or the rows.) Afterwards it's also unclear to me whether I can simply say that my orthogonal matrix is the isometry there, so that $y=Qx$ and $\|y\| = \|x\|$ hold.

  3. How can I prove it the way the linked question does (only the "easy" symmetric case)?
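Regarding the $Q^{-1} = Q^\intercal$ point above, a numerical sketch (again with an illustrative random symmetric matrix) confirms that the orthogonal $Q$ from the spectral decomposition is the isometry in question:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
B = (A + A.T) / 2

lams, Q = np.linalg.eigh(B)           # columns of Q are orthonormal eigenvectors
assert np.allclose(Q.T @ Q, np.eye(4))          # Q orthogonal: Q^{-1} = Q^T
assert np.allclose(B, Q @ np.diag(lams) @ Q.T)  # B = Q Lambda Q^T

x = rng.standard_normal(4)
y = Q.T @ x                            # the isometry: ||y|| = ||x||
assert np.isclose(np.linalg.norm(y), np.linalg.norm(x))

# x^T B^T B x = y^T Lambda^2 y,  since B^T B = Q Lambda^2 Q^T
assert np.isclose(x @ B.T @ B @ x, y @ np.diag(lams**2) @ y)
```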
Best answer:

The following might be a helpful approach to consider:

As pointed out above, we have: $$ B=\sum_{i=1}^{n}\lambda_{i}v_{i}v_{i}^{T} $$

Then,

$$ Bx=\sum_{i=1}^{n}\lambda_{i}v_{i}v_{i}^{T}x $$

$$ \lVert Bx \rVert^{2}= \left(Bx\right)^{T}\left(Bx\right)=\left(\sum_{i=1}^{n}\lambda_{i}v_{i}v_{i}^{T}x\right)^{T} \left(\sum_{i=1}^{n}\lambda_{i}v_{i}v_{i}^{T}x\right) $$

$$ \lVert Bx \rVert^{2}=\left(\sum_{i=1}^{n}\lambda_{i}x^{T}v_{i}v_{i}^{T}\right) \left(\sum_{i=1}^{n}\lambda_{i}v_{i}v_{i}^{T}x\right) $$

Expanding the product gives a double sum:

$$ \lVert Bx \rVert^{2}=\sum_{i=1}^{n}\sum_{j=1}^{n}\lambda_{i}\lambda_{j}\left(x^{T}v_{i}\right)\left(v_{i}^{T}v_{j}\right)\left(v_{j}^{T}x\right) $$

Now, due to the orthonormality of the $v_{i}$ ($v_{i}^{T}v_{j}=1$ for $i=j$, else $0$), only the terms with $i=j$ survive, and we obtain:

$$ \lVert Bx \rVert^{2}=\sum_{i=1}^{n}\lambda_{i}^{2}\left(x^{T}v_{i}\right)^{2}=\sum_{i=1}^{n}\lambda_{i}^{2}\left(v_{i}^{T}x\right)^{2} $$

noting that $x^{T}v_{i}=v_{i}^{T}x$, since they are just scalars.
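This identity, together with question 2 above, can be checked numerically. For question 2: since the $v_i$ form an orthonormal basis, $x = \sum_i (v_i^\intercal x)\, v_i$, and taking $\|\cdot\|^2$ of both sides kills the cross terms, giving $\sum_i (v_i^\intercal x)^2 = \|x\|^2$ (Parseval's identity). A sketch with an illustrative random example:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))
B = (A + A.T) / 2
lams, V = np.linalg.eigh(B)
x = rng.standard_normal(5)

coeffs = V.T @ x                       # coeffs[i] = v_i^T x

# ||Bx||^2 = sum_i lambda_i^2 (v_i^T x)^2   (the derived identity)
assert np.isclose(np.linalg.norm(B @ x)**2, np.sum(lams**2 * coeffs**2))

# Parseval: sum_i (v_i^T x)^2 = ||x||^2     (question 2)
assert np.isclose(np.sum(coeffs**2), np.linalg.norm(x)**2)
```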

I hope this helps.