Are the following statements true regarding the singular values of a real matrix?


Let $A\in \mathbb{R}^{m\times k}$ $(k<m)$ be a real matrix. Suppose that for some $\delta \in (0,1)$ the following inequality holds for all $x \in \mathbb{R}^{k}$: \begin{equation} (1-\delta) \|x\|_2^2 \leq \|Ax\|_2^2 \leq (1+\delta)\|x\|_2^2. \tag{1} \end{equation} Let $\sigma_{max}$ and $\sigma_{min}$ denote the largest and smallest singular values of $A$, respectively.

Then I have the following statements:

(1). Rank of the matrix $A$ is $k$.

(2). $\sigma_{min}\geq \sqrt{1-\delta}$.

(3). $\sigma_{max}\leq \sqrt{1+\delta}$.

Are all three of the above statements true?

As Robert Israel asked, here is what I have tried:

The largest singular value of a matrix $A\in\mathbb{R}^{m\times k}$ is defined as \begin{equation} \sigma_{max} = \max_{x\in\mathbb{R}^{k},\ x\neq 0} \frac{\|Ax\|_2}{\|x\|_2}. \end{equation} Suppose the maximum above is attained at a vector $\hat{x}\neq 0$, so that \begin{equation} \sigma_{max}^2 \|\hat{x}\|_2^2 = \|A\hat{x}\|_2^2. \end{equation}

Applying the upper bound in inequality (1) to $\hat{x}$ gives $\sigma_{max}^2 \|\hat{x}\|_2^2 = \|A\hat{x}\|_2^2 \leq (1+\delta)\|\hat{x}\|_2^2$, and hence $\sigma_{max} \leq \sqrt{1+\delta}$. This proves statement (3).
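As a quick numerical sanity check (not a proof), the sketch below builds a hypothetical matrix satisfying (1) by prescribing singular values inside $[\sqrt{1-\delta}, \sqrt{1+\delta}]$, and confirms that $\sigma_{max}$ dominates every ratio $\|Ax\|_2/\|x\|_2$ and stays below $\sqrt{1+\delta}$. The construction via QR factors is my own illustrative choice, not something from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
m, k, delta = 8, 3, 0.2

# Build a matrix satisfying (1): orthonormal factors with prescribed
# singular values drawn from [sqrt(1 - delta), sqrt(1 + delta)].
U, _ = np.linalg.qr(rng.standard_normal((m, k)))   # m x k, orthonormal columns
V, _ = np.linalg.qr(rng.standard_normal((k, k)))   # k x k, orthogonal
s = np.sqrt(rng.uniform(1 - delta, 1 + delta, size=k))
A = U @ np.diag(s) @ V.T

sigma_max = np.linalg.norm(A, 2)   # spectral norm = largest singular value

# sigma_max is an upper bound on ||Ax|| / ||x|| for every nonzero x ...
for _ in range(1000):
    x = rng.standard_normal(k)
    assert np.linalg.norm(A @ x) / np.linalg.norm(x) <= sigma_max + 1e-12

# ... and, since (1) holds at the maximizer, sigma_max <= sqrt(1 + delta).
assert sigma_max <= np.sqrt(1 + delta) + 1e-12
```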

Now I am stuck on how to prove that \begin{equation} \sigma_{min}= \min_{x\in\mathbb{R}^k,\ x\neq 0} \frac{\|Ax\|_2}{\|x\|_2}. \end{equation} If I can prove this characterization, I will be able to prove statements (1) and (2) by the same argument.
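The characterization can at least be checked numerically. In the sketch below (my own illustration, for an arbitrary random matrix), the right-singular vector for the smallest singular value attains the minimum of $\|Ax\|_2/\|x\|_2$, and random trial vectors never dip below it:

```python
import numpy as np

rng = np.random.default_rng(2)
m, k = 7, 3
A = rng.standard_normal((m, k))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
sigma_min = s[-1]
v_min = Vt[-1]   # unit right-singular vector paired with sigma_min

# The minimum of ||Ax|| / ||x|| is attained at v_min ...
assert np.isclose(np.linalg.norm(A @ v_min), sigma_min)

# ... and no other direction gives a smaller ratio.
for _ in range(1000):
    x = rng.standard_normal(k)
    assert np.linalg.norm(A @ x) / np.linalg.norm(x) >= sigma_min - 1e-12
```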

Any help would be very much appreciated!


There is 1 solution below.


The squared singular values of $A$ are by definition the eigenvalues of $A^T A$. Since $A^T A$ is a real symmetric matrix, its eigenvalues are real and the corresponding eigenvectors may be chosen to be real. Note that for any $x \in \mathbb R^k$, $x^T A^T A x = \|Ax\|^2$, and so if $\sigma^2$ is an eigenvalue of $A^T A$ with eigenvector $x \neq 0$, then $\|A x\|^2 = x^T (\sigma^2 x) = \sigma^2 \|x\|^2$. Thus your inequality (1) implies $1 - \delta \le \sigma^2 \le 1 + \delta$ for every singular value $\sigma$, which gives statements (2) and (3). In particular $\sigma_{min}^2 \ge 1 - \delta > 0$, so $A^T A$ has no zero eigenvalue; hence $A^T A$ is invertible and $A$ has rank $k$, which is statement (1).
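The whole argument can be sketched numerically. The check below (an illustrative construction under the same assumptions as above, not part of the original answer) verifies that the eigenvalues of $A^T A$ are the squared singular values, and that a matrix satisfying (1) obeys all three statements:

```python
import numpy as np

rng = np.random.default_rng(1)
m, k, delta = 10, 4, 0.3

# Hypothetical A obeying (1): prescribed singular values in
# [sqrt(1 - delta), sqrt(1 + delta)] with orthonormal factors.
U, _ = np.linalg.qr(rng.standard_normal((m, k)))
V, _ = np.linalg.qr(rng.standard_normal((k, k)))
s = np.sqrt(rng.uniform(1 - delta, 1 + delta, size=k))
A = U @ np.diag(s) @ V.T

# Squared singular values = eigenvalues of A^T A.
eig = np.linalg.eigvalsh(A.T @ A)
svals = np.linalg.svd(A, compute_uv=False)
assert np.allclose(np.sort(eig), np.sort(svals**2))

# All three statements hold:
assert np.linalg.matrix_rank(A) == k                 # (1) rank k
assert svals.min() >= np.sqrt(1 - delta) - 1e-12     # (2) sigma_min bound
assert svals.max() <= np.sqrt(1 + delta) + 1e-12     # (3) sigma_max bound
```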