The smallest non-zero singular value of AB


For a matrix $A$, let $\sigma_{\min}(A)$ denote the smallest $\textbf{non-zero}$ singular value of $A$.

I have seen some materials (e.g. https://pdfs.semanticscholar.org/e74f/89ac85239811b295787619e73c99fc867724.pdf) stating that $$\sigma_{i}(AB) \ge \sigma_{i}(A)\,\sigma_{n}(B), \qquad i=1,\cdots,n.$$

Here $A$ and $B$ are rectangular matrices with dimensions, say, $p\times n$ and $n\times m$. This holds even for zero singular values. So if $B$ is not of full rank, then $\sigma_n(B) = 0$ and the above bound is trivial.

My question is whether the following inequality holds: $$\sigma_{\min}(AB) \ge \sigma_{\min}(A)\,\sigma_{\min}(B).$$

Thanks.


Best answer

In general it isn't true. E.g. when $A$ is a nonzero row vector and $B$ is a nonzero column vector such that $AB=0$, both $\sigma_\min(A)$ and $\sigma_\min(B)$ are positive, but $AB$ has no nonzero singular values.
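This counterexample is easy to verify numerically. Below is a small NumPy sketch; `sigma_star` is a helper I introduce here (not a library function) that returns the smallest non-zero singular value, or $0$ when there is none:

```python
import numpy as np

def sigma_star(M, tol=1e-12):
    """Smallest non-zero singular value of M, or 0.0 if M has none."""
    s = np.linalg.svd(M, compute_uv=False)
    s = s[s > tol]
    return s.min() if s.size else 0.0

A = np.array([[1.0, 0.0]])    # nonzero row vector
B = np.array([[0.0], [1.0]])  # nonzero column vector with A B = 0
print(sigma_star(A), sigma_star(B), sigma_star(A @ B))  # 1.0 1.0 0.0
```

Both factors have smallest non-zero singular value $1$, yet $AB = 0$ has none, so no inequality of the form $\sigma_\min(AB) \ge \sigma_\min(A)\sigma_\min(B)$ can hold without extra assumptions.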

The inequality holds, however, when $A$ has full column rank and $B\ne0$, or when $A\ne0$ and $B$ has full row rank. Since the latter case reduces to the former by considering $(AB)^T$, we shall deal only with the first case. In this case, $Bx=0$ (equivalently, $ABx=0$, since $A$ has full column rank) if and only if $x\in\operatorname{range}(B^\ast)^\perp$, and whenever $ABx\ne0$, we have $$ \|ABx\| =\left\|A\frac{Bx}{\|Bx\|}\right\|\|Bx\| =\|Ay\|\|Bx\| $$ for some unit vector $y$. Therefore, by the Courant–Fischer minimax principle,

$$ \sigma_\min(AB) =\min_{\|x\|=1,\,x\in\operatorname{range}(B^\ast)}\|ABx\| \ge \min_{\|y\|=1} \left\|Ay\right\| \min_{\|x\|=1,\,x\in\operatorname{range}(B^\ast)}\|Bx\| = \sigma_\min(A)\sigma_\min(B). $$
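As a numerical sanity check of this case (a sketch, assuming NumPy; `sigma_star` is a hypothetical helper computing the smallest non-zero singular value), draw a random full-column-rank $A$ and a rank-deficient but nonzero $B$, and confirm the bound:

```python
import numpy as np

def sigma_star(M, tol=1e-12):
    # smallest non-zero singular value of M, or 0.0 if M has none
    s = np.linalg.svd(M, compute_uv=False)
    s = s[s > tol]
    return s.min() if s.size else 0.0

rng = np.random.default_rng(0)
for _ in range(100):
    A = rng.standard_normal((5, 3))  # full column rank almost surely
    # rank(B) <= 2 < 3, so B is rank-deficient but nonzero
    B = rng.standard_normal((3, 2)) @ rng.standard_normal((2, 4))
    assert sigma_star(A @ B) >= sigma_star(A) * sigma_star(B) * (1 - 1e-6)
```

The multiplicative slack only absorbs floating-point error; the mathematical inequality is exact.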

Another answer

Let $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$. Let $\sigma_*(A)$ be the smallest non-zero singular value of $A$ if $A \neq 0$, and zero if $A = 0$. I use $\sigma_*$ to avoid confusion with the smallest singular value.

Counter-example for a general lower bound

Let $A = \operatorname{diag}(1, 0)$. Then $\sigma_*(A) = 1$.

  • Let $B = \operatorname{diag}(1, 2)$. Then $AB = \operatorname{diag}(1, 0)$, and $\sigma_*(A) \sigma_*(B) = 1 = \sigma_*(AB)$.
  • Let $B = \operatorname{diag}(2, 1)$. Then $AB = \operatorname{diag}(2, 0)$, and $\sigma_*(A) \sigma_*(B) = 1 < 2 = \sigma_*(AB)$.
  • Let $B = \begin{bmatrix} 0 & 1 \\ 0 & 1 \end{bmatrix}$. Then $AB = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$, and $\sigma_*(A) \sigma_*(B) = \sqrt{2} > 1 = \sigma_*(AB)$.

Hence no general lower (or upper) bound of this form exists that works for every pair of matrices: additional assumptions are needed.
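The three cases can be replayed numerically (a NumPy sketch; `sigma_star` is a helper implementing $\sigma_*$ as defined above):

```python
import numpy as np

def sigma_star(M, tol=1e-12):
    # smallest non-zero singular value of M, or 0.0 if M has none
    s = np.linalg.svd(M, compute_uv=False)
    s = s[s > tol]
    return s.min() if s.size else 0.0

A = np.diag([1.0, 0.0])
B1 = np.diag([1.0, 2.0])                 # equality
B2 = np.diag([2.0, 1.0])                 # product < sigma_*(AB)
B3 = np.array([[0.0, 1.0], [0.0, 1.0]])  # product > sigma_*(AB)
for B in (B1, B2, B3):
    print(sigma_star(A @ B), sigma_star(A) * sigma_star(B))
```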

Relaxed sufficient conditions

The accepted answer by user1551 gives sufficient conditions for the stated lower bound to hold: $A$ has full column rank and $B \neq 0$, or $B$ has full row rank and $A \neq 0$.

I'll note that the lower bound also holds under the following relaxed condition:

$$\mathrm{range}(BB^T) \subset \mathrm{range}(A^T) \lor \mathrm{range}(A^T A) \subset \mathrm{range}(B).$$

Note that:

  • when $A$ has full column rank, $\mathrm{range}(A^T) = \mathbb{R}^n$,
  • when $B$ has full row rank, $\mathrm{range}(B) = \mathbb{R}^n$,
  • $\mathrm{range}(A^T) = \mathrm{null}(A)^{\perp}$,
  • $\mathrm{range}((AB)^T) = \mathrm{range}(B^T A^T) \subset \mathrm{range}(B^T)$,
  • $\mathrm{null}(B) \cap \mathrm{range}(B^T) = \mathrm{null}(B) \cap \mathrm{null}(B)^{\perp} = \{0\}$.
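To illustrate that this condition is strictly weaker than the accepted answer's, here is a small example of my own (not from the answers above): $A = \operatorname{diag}(2, 0)$ does not have full column rank, yet $\mathrm{range}(BB^T) = \mathrm{span}(e_1) = \mathrm{range}(A^T)$, so the relaxed condition holds and the bound is attained with equality.

```python
import numpy as np

def sigma_star(M, tol=1e-12):
    # smallest non-zero singular value of M, or 0.0 if M has none
    s = np.linalg.svd(M, compute_uv=False)
    s = s[s > tol]
    return s.min() if s.size else 0.0

# A lacks full column rank, so the earlier sufficient condition fails,
# but range(B B^T) = span(e_1) = range(A^T): the relaxed condition holds.
A = np.diag([2.0, 0.0])
B = np.array([[1.0, 3.0], [0.0, 0.0]])
print(sigma_star(A @ B), sigma_star(A) * sigma_star(B))  # both 2*sqrt(10)
```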

Suppose $A = 0$ or $B = 0$. Then $\sigma_*(AB) = \sigma_*(0) = 0 = \sigma_*(A) \sigma_*(B)$, and the lower bound holds.

Suppose $A \neq 0$, $B \neq 0$, and $\mathrm{range}(BB^T) \subset \mathrm{range}(A^T)$. Then

$$ \begin{aligned} {} & \sigma_*(AB) \\ = & \min \left\{||AB x|| : x \in \mathrm{range}((AB)^T) \land ||x|| = 1 \right\} \\ \geq & \min \left\{||AB x|| : x \in \mathrm{range}(B^T) \land ||x|| = 1 \right\} \\ = & \min \left\{\left\lVert A \frac{Bx}{||Bx||}\right\rVert ||Bx|| : x \in \mathrm{range}(B^T) \land ||x|| = 1 \right\} \\ \geq & \min \left\{\left\lVert A \frac{Bx}{||Bx||}\right\rVert : x \in \mathrm{range}(B^T) \land ||x|| = 1 \right\} \\ \cdot \; & \min \left\{||Bx|| : x \in \mathrm{range}(B^T) \land ||x|| = 1 \right\} \\ \end{aligned} $$

For the first factor,

$$ \begin{aligned} {} & \min \left\{\left\lVert A \frac{Bx}{||Bx||}\right\rVert : x \in \mathrm{range}(B^T) \land ||x|| = 1 \right\} \\ = & \min \left\{\left\lVert A \frac{Bx}{||Bx||}\right\rVert : x \in \mathrm{range}(B^T) \setminus \{0\} \right\} \\ = & \min \left\{||A y|| : y \in \mathrm{range}(BB^T) \land ||y|| = 1 \right\} \\ \geq & \min \left\{||A y|| : y \in \mathrm{range}(A^T) \land ||y|| = 1 \right\} \\ = & \sigma_*(A). \end{aligned} $$

For the second factor, $\min \left\{||Bx|| : x \in \mathrm{range}(B^T) \land ||x|| = 1 \right\} = \sigma_*(B)$, since $\mathrm{range}(B^T) = \mathrm{null}(B)^{\perp}$. Therefore

$$\sigma_*(AB) \geq \sigma_*(A) \sigma_*(B).$$

Suppose $A \neq 0$, $B \neq 0$, and $\mathrm{range}(A^T A) \subset \mathrm{range}(B)$. By the previous case

$$\sigma_*(AB) = \sigma_*((AB)^T) = \sigma_*(B^T A^T) \geq \sigma_*(B^T) \sigma_*(A^T) = \sigma_*(A) \sigma_*(B).$$
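The result can be stress-tested with random matrices whose relevant ranges share a common subspace (a sketch; the construction forces $\mathrm{range}(A^T) = \mathrm{range}(Q)$ and $\mathrm{range}(BB^T) \subseteq \mathrm{range}(Q)$ almost surely, so the first relaxed condition holds):

```python
import numpy as np

def sigma_star(M, tol=1e-12):
    # smallest non-zero singular value of M, or 0.0 if M has none
    s = np.linalg.svd(M, compute_uv=False)
    s = s[s > tol]
    return s.min() if s.size else 0.0

rng = np.random.default_rng(1)
n, k = 6, 3
for _ in range(100):
    # orthonormal basis Q of a random k-dimensional subspace of R^n
    Q, _ = np.linalg.qr(rng.standard_normal((n, k)))
    A = rng.standard_normal((5, k)) @ Q.T  # range(A^T) = range(Q) a.s.
    B = Q @ rng.standard_normal((k, 4))    # range(B B^T) contained in range(Q)
    assert sigma_star(A @ B) >= sigma_star(A) * sigma_star(B) * (1 - 1e-6)
```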

Application

I'll now describe an interesting application of the bound. Consider $||(AB)^+||$, where $||\cdot||$ denotes the operator 2-norm and $+$ denotes the Moore-Penrose inverse. It is not difficult to show (e.g. by employing the singular value decomposition) that $||A^+|| = \sigma_*(A)^+$, where for scalars $\sigma^+ = 1/\sigma$ if $\sigma \neq 0$ and $0^+ = 0$. Assuming now the relaxed sufficient condition, we get the nice upper bound

$$||(AB)^+|| \leq ||A^+|| ||B^+||.$$

Note that this result is not trivial, since in general $(AB)^+$ does not equal $B^+ A^+$.
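Both points can be illustrated numerically (a sketch, assuming NumPy; `numpy.linalg.pinv` computes the Moore-Penrose inverse and `numpy.linalg.norm(M, 2)` the operator 2-norm):

```python
import numpy as np

def norm2(M):
    return np.linalg.norm(M, 2)  # operator 2-norm

pinv = np.linalg.pinv

# A has full column rank and B is nonzero, so the upper bound applies:
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))
B = rng.standard_normal((3, 1)) @ rng.standard_normal((1, 4))  # rank-1 B
assert norm2(pinv(A @ B)) <= norm2(pinv(A)) * norm2(pinv(B)) * (1 + 1e-6)

# ...even though (AB)^+ = B^+ A^+ fails in general:
A2 = np.array([[1.0, 1.0]])
B2 = np.array([[1.0], [0.0]])
print(pinv(A2 @ B2), pinv(B2) @ pinv(A2))  # [[1.]] vs [[0.5]]
```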

Open question

Is the relaxed condition also necessary?