Show that every rank-deficient matrix has a full rank matrix arbitrarily close to it


I came across the following exercise in preparation for a test.

Let $A \in \mathbb{R}^{n \times m}$ with rank $r < \min\{n,m\}$. Use the SVD of $A$ to show that for every $\epsilon >0$ (no matter how small), there exists a full-rank matrix $A_{\epsilon} \in \mathbb{R}^{n\times m}$ such that $|| A - A_\epsilon ||_{2} < \epsilon$.

I am really puzzled about how to approach this exercise. From my understanding $$ A - A_\epsilon = \underset{n\times n}{U} \ \underset{n \times m}{\Sigma} \ \underset{m\times m}{V^{T}} - \underset{n\times n}{U} \ \underset{n \times m}{\Sigma_\epsilon} \ \underset{m\times m}{V^{T}} = U(\Sigma - \Sigma_\epsilon)V^{T}.$$ Then $\Sigma$ is a diagonal matrix with singular values ranging from $\sigma_1$ to $\sigma_{\min\{n,m\}}$, and $\Sigma_\epsilon$ has singular values from $\sigma_1$ to $\sigma_m$ since $A_\epsilon$ is of rank $m$.

Then since $\min\{n,m\} \le m$ $$ \Sigma - \Sigma_\epsilon = \begin{pmatrix} \sigma_1 & & & \\ & \ddots & \\ & & \sigma_n & \\ & & & 0\end{pmatrix} - \begin{pmatrix} \sigma_1 & & & \\ & \ddots & \\ & & \sigma_{\min\{n,m\}} & \\ & & & 0\end{pmatrix} = \begin{pmatrix} 0& & & \\ & \ddots & \\ & & \sigma_? & \\ & & & 0\end{pmatrix}$$ So since a singular value exists for the matrix difference above, $|| A - A_\epsilon ||_2 = \sigma_?$, which has to be small since $\sigma_1 > \sigma_2 > \dots > \sigma_r$, and this $\sigma_?$ is among the last in the sequence.

I am most likely writing nonsense, I am aware of that. I don't know how to proceed; can anyone give me a hint or an idea?



Best answer:

You are very close. The proof needs a couple of ingredients:

  • What is the norm of A (say the 2-norm, since they are all equivalent) in terms of $\Sigma$? And what's the norm of $\Sigma$ in terms of the $\sigma_i$'s?
  • The simplest way to perturb $\Sigma$ is to let $\Sigma_\varepsilon$ have its first $r$ singular values matching those of $\Sigma$.

If you write $\Sigma_\varepsilon$ as described above, what would $\|\Sigma_\varepsilon-\Sigma\|$ be? And therefore, what about $\|A-A_\varepsilon\|$?
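The first hint can be checked numerically. The sketch below (illustrative values, not part of the proof) verifies that the 2-norm of a matrix equals its largest singular value, $\|A\|_2 = \sigma_1$:

```python
# Numerical check that ||A||_2 equals the largest singular value of A.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))  # any rectangular matrix will do

sigma = np.linalg.svd(A, compute_uv=False)  # singular values, descending
assert np.isclose(np.linalg.norm(A, 2), sigma[0])
```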

Edit: read further for more details.

You are free to build $A_\varepsilon$ in whatever way you want. The goal is to produce one matrix that is close to $A$, however peculiar its construction may be. So pick some $A_\varepsilon=U\Sigma_\varepsilon V^T$, where $\Sigma_\varepsilon$ has the same first $r$ singular values as $\Sigma$, and $\sigma_i<\varepsilon$ for $i=r+1,\dots,p$, with $p=\min\{n,m\}$. (Note: you could pick any values for $\sigma_{r+1},\dots,\sigma_p$, so long as they are all positive and all smaller than $\varepsilon$.) Then $\|\Sigma-\Sigma_\varepsilon\|=\sigma_{r+1}<\varepsilon$. This proves that $A_\varepsilon$ is arbitrarily close to $A$. The fact that all its singular values are nonzero (small as they may be) implies that $A_\varepsilon$ is full rank. QED.
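The construction above can be sketched numerically. This is an illustration with made-up dimensions (a $5\times 3$ matrix of rank $2$), not part of the proof: replace the zero singular values by $\varepsilon/2$ and check that the result is full rank and within $\varepsilon$ of $A$ in the 2-norm.

```python
# Sketch: perturb the zero singular values of a rank-deficient A
# to obtain a full-rank A_eps with ||A - A_eps||_2 < eps.
import numpy as np

n, m, r = 5, 3, 2          # A is 5x3 with rank 2 < min(5, 3)
eps = 1e-6

rng = np.random.default_rng(1)
A = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))  # rank r

U, s, Vt = np.linalg.svd(A)        # U: n x n, Vt: m x m
s_eps = s.copy()
s_eps[r:] = eps / 2                # keep the first r sigmas, lift the rest

Sigma_eps = np.zeros((n, m))
np.fill_diagonal(Sigma_eps, s_eps)
A_eps = U @ Sigma_eps @ Vt

assert np.linalg.matrix_rank(A_eps) == min(n, m)   # full rank
assert np.linalg.norm(A - A_eps, 2) < eps          # within eps of A
```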

Another answer:

Note that in the case of square matrices ($n\times n$), there is perhaps a shorter proof.

$p(x)=\det(A-xI)$ is a polynomial in $x$ and has at most $n$ roots in $\mathbb C$.

So let's take $A_\epsilon=A-\epsilon I$; we have $\det(A_\epsilon)=\det(A-\epsilon I)=p(\epsilon)$.

The number of roots of $p$ being finite, we can always choose $\epsilon$ (arbitrarily small) such that $p(\epsilon)\neq 0$, making $A_\epsilon$ invertible; and since $\|A-A_\epsilon\|_2=\|\epsilon I\|_2=\epsilon$, this $A_\epsilon$ is as close to $A$ as desired.
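The shift argument can also be illustrated numerically. Below is a small example (the matrix and $\epsilon$ are chosen for illustration) showing that $A-\epsilon I$ is invertible whenever $\epsilon$ is not an eigenvalue of $A$, while $\|A-(A-\epsilon I)\|_2=\epsilon$:

```python
# Illustration of the shift argument: A - eps*I is invertible when eps
# is not an eigenvalue of A, and differs from A by exactly eps in 2-norm.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank 1, so det(A) == 0

eps = 1e-8                          # eigenvalues of A are 0 and 5, so eps is safe
A_eps = A - eps * np.eye(2)

assert abs(np.linalg.det(A)) < 1e-12        # A is singular
assert abs(np.linalg.det(A_eps)) > 1e-9     # A_eps is invertible
assert np.isclose(np.linalg.norm(A - A_eps, 2), eps)
```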