Proving that $\min\limits_{x \neq 0} \frac{\|Ax\|_2}{\|x\|_2}=\sigma_n$


For $A \in \mathbb R^{m \times n}$, $m \geq n$, $x \in \mathbb R^{n \times 1}$, how does one prove that $\min\limits_{x \neq 0} \frac{\|Ax\|_2}{\|x\|_2}=\sigma_n$, where $\sigma_n$ is the smallest singular value of $A$? I'm new to SVD, so any help would be appreciated.

The most direct way to prove it, without any "educated guesses", is to solve the constrained optimization problem via Lagrange Multipliers. First, we rewrite the problem in standard form:

$$ \min_x \underbrace{\|Ax\|_2}_{f(x)} \qquad \text{s.t.}\qquad \underbrace{\|x\|_2-1}_{g(x)}=0$$

To make our lives easier, we will consider the squared version $f(x)=\|Ax\|^2_2$, $g(x) = \|x\|_2^2-1$, which is equivalent since squaring is monotone on nonnegative values (though you can also work with the original objective). The Lagrangian is then:

$$\mathcal L(x, \lambda) = \|Ax\|_2^2 - \lambda (\|x\|_2^2-1)$$

The first-order necessary condition implies

$$0\overset{!}{=}\frac{\partial}{\partial x} \mathcal L(x, \lambda) = 2A^T Ax-2\lambda x \iff A^T A x= \lambda x$$
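As a quick sanity check on that gradient formula (not part of the proof), the following sketch compares the analytic gradient $2A^TAx - 2\lambda x$ against central finite differences of the Lagrangian; the matrix, point, and multiplier value are arbitrary choices for the check.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 5, 3
A = rng.standard_normal((m, n))   # arbitrary test matrix
x = rng.standard_normal(n)        # arbitrary test point
lam = 0.7                         # arbitrary multiplier value

def lagrangian(x, lam):
    # L(x, lam) = ||Ax||^2 - lam * (||x||^2 - 1)
    return np.linalg.norm(A @ x) ** 2 - lam * (np.linalg.norm(x) ** 2 - 1)

# Analytic gradient from the first-order condition.
grad = 2 * A.T @ A @ x - 2 * lam * x

# Independent check via central finite differences.
eps = 1e-6
num = np.array([
    (lagrangian(x + eps * e, lam) - lagrangian(x - eps * e, lam)) / (2 * eps)
    for e in np.eye(n)
])
print(np.max(np.abs(grad - num)))  # should be tiny
```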

So the optimum is attained at a unit eigenvector of $A^TA$; note that at such a point $\|Ax\|_2^2 = x^TA^TAx = \lambda\, x^Tx = \lambda$. I think you can finish from here.
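For intuition, one can also verify the claim numerically (a sketch, not a proof): the eigenvector of $A^TA$ for its smallest eigenvalue attains the ratio $\sigma_n$, and random directions never do better. The matrix size and sample count below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 4
A = rng.standard_normal((m, n))   # arbitrary test matrix

# Singular values in descending order, so sigma_n is the last one.
sigma = np.linalg.svd(A, compute_uv=False)
sigma_n = sigma[-1]

# eigh returns eigenvalues in ascending order; column 0 is the
# eigenvector for the smallest eigenvalue of A^T A.
eigvals, eigvecs = np.linalg.eigh(A.T @ A)
v = eigvecs[:, 0]
ratio_at_v = np.linalg.norm(A @ v) / np.linalg.norm(v)

# Random nonzero vectors: the ratio should never drop below sigma_n.
samples = rng.standard_normal((n, 1000))
ratios = np.linalg.norm(A @ samples, axis=0) / np.linalg.norm(samples, axis=0)

print(sigma_n, ratio_at_v, ratios.min())
```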