Bound for singular value of the difference of two positive semidefinite matrices


Let $A$ and $B$ be two positive semidefinite $n$-by-$n$ matrices. I am interested in whether I can bound $$ \sigma_1 (A-B), $$ the largest singular value of $A-B$, by a function (only) of $\lambda_1(A)-\lambda_1(B)$, $\dots$, $\lambda_n(A)-\lambda_n(B)$, where $\lambda_k (A)$ is the $k$th largest eigenvalue of $A$. I can derive the following bound $$ \sigma_1(A-B) \leq \max \{|\lambda_1(A) - \lambda_n(B)|, |\lambda_n(A) - \lambda_1(B)| \}. $$
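For what it's worth, a quick numerical sanity check of this bound on random positive semidefinite matrices (a sketch using NumPy; `random_psd` is just a hypothetical helper name for generating test matrices):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n, rng):
    # G G^T is positive semidefinite for any real G
    G = rng.standard_normal((n, n))
    return G @ G.T

for _ in range(200):
    n = int(rng.integers(2, 6))
    A, B = random_psd(n, rng), random_psd(n, rng)
    sigma1 = np.linalg.norm(A - B, 2)          # largest singular value of A - B
    eA = np.sort(np.linalg.eigvalsh(A))[::-1]  # eigenvalues, descending
    eB = np.sort(np.linalg.eigvalsh(B))[::-1]
    bound = max(abs(eA[0] - eB[-1]), abs(eA[-1] - eB[0]))
    assert sigma1 <= bound + 1e-9
```

(The bound itself follows from Weyl's inequalities: every eigenvalue of the symmetric matrix $A-B$ lies in $[\lambda_n(A)-\lambda_1(B),\ \lambda_1(A)-\lambda_n(B)]$.)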

I think this is possible if $A$ and $B$ are simultaneously diagonalizable and the two can be ordered with respect to the Loewner partial ordering, but I'm wondering if there are weaker conditions that make this possible. Thanks in advance.

EDIT: It turns out that my question was not clear at all. I meant to say that, while I do know that the bound $$ \sigma_1(A-B) \leq \max \{|\lambda_1(A) - \lambda_n(B)|, |\lambda_n(A) - \lambda_1(B)| \} $$ holds, I was wondering if there is a bound that depends only on $\lambda_1(A)-\lambda_1(B)$, $\dots$, $\lambda_n(A)-\lambda_n(B)$. Thanks again.


We have the following (not very good) bound. Let $\|A\| = \sigma_1(A)$ denote the spectral norm. Let $\alpha_j = \lambda_j(A)$ and $\beta_j = \lambda_j(B)$. Let $x_j$ denote a unit eigenvector of $A$ associated with $\alpha_j$, and let $y_j$ denote a unit eigenvector of $B$ associated with $\beta_j$, so that $$ A = \sum_j \alpha_j x_jx_j^T, \quad B = \sum_j \beta_j y_jy_j^T. $$ We note that $$ \alpha_j x_jx_j^T - \beta_j y_jy_j^T = \alpha_j(x_jx_j^T - y_j y_j^T) + (\alpha_j - \beta_j)y_jy_j^T. $$ We find that $$ x_jx_j^T - y_jy_j^T = \pmatrix{x_j & -y_j}\pmatrix{x_j & y_j}^T,\\ M_j = \pmatrix{x_j & y_j}^T\pmatrix{x_j & -y_j} = \pmatrix{x_j^Tx_j & -x_j^Ty_j\\y_j^Tx_j & -y_j^Ty_j} = \pmatrix{1 & -x_j^Ty_j\\x_j^Ty_j & -1}. $$ Since the non-zero eigenvalues of a product $UV$ coincide with those of $VU$, the non-zero eigenvalues of $x_jx_j^T - y_jy_j^T$ are the eigenvalues of $M_j$, which are $\pm \sqrt{1 - (x_j^Ty_j)^2}$. It follows that $$ \|x_jx_j^T - y_jy_j^T\| = \sqrt{1 - (x_j^Ty_j)^2} \leq 1. $$ Putting things together, we have $$ \begin{align} \|A - B\| &= \left\|\sum_j (\alpha_j x_jx_j^T - \beta_j y_jy_j^T) \right\| \\ &= \left\|\sum_j \alpha_j(x_jx_j^T - y_j y_j^T) + (\alpha_j - \beta_j)y_jy_j^T \right\| \\ & \leq \sum_j (\alpha_j\|x_jx_j^T - y_j y_j^T\| + |\alpha_j - \beta_j|\cdot \|y_jy_j^T\|) \\ & = \sum_j (\alpha_j\|x_jx_j^T - y_j y_j^T\| + |\alpha_j - \beta_j|) \\ & \leq \sum_j (\alpha_j + |\alpha_j - \beta_j|) = \operatorname{trace}(A) + \sum_{j}|\lambda_j(A) - \lambda_j(B)|. \end{align} $$
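This bound can likewise be checked numerically on random positive semidefinite pairs (a sketch, again assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)

for _ in range(200):
    n = int(rng.integers(2, 6))
    GA = rng.standard_normal((n, n))
    GB = rng.standard_normal((n, n))
    A, B = GA @ GA.T, GB @ GB.T            # random PSD matrices
    eA = np.sort(np.linalg.eigvalsh(A))[::-1]
    eB = np.sort(np.linalg.eigvalsh(B))[::-1]
    lhs = np.linalg.norm(A - B, 2)         # spectral norm ||A - B||
    rhs = np.trace(A) + np.sum(np.abs(eA - eB))
    assert lhs <= rhs + 1e-9
```

Note that the right-hand side involves $\operatorname{trace}(A)$, so this is still not a function of the eigenvalue differences alone.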


A possible slight improvement: we have $$ \alpha x_jx_j^T - \beta y_jy_j^T = \pmatrix{\alpha x_j & -\beta y_j}\pmatrix{x_j & y_j}^T,\\ P_j = \pmatrix{x_j & y_j}^T\pmatrix{\alpha x_j & -\beta y_j} = \pmatrix{\alpha x_j^Tx_j & -\beta x_j^Ty_j\\ \alpha y_j^Tx_j & -\beta y_j^Ty_j} = \pmatrix{\alpha & -\beta x_j^Ty_j\\ \alpha x_j^Ty_j & -\beta}. $$ The eigenvalues of this matrix are the solutions to $$ \lambda^2 - (\alpha - \beta)\lambda - \alpha \beta(1 - (x_j^Ty_j)^2) = 0 \implies\\ \lambda = \frac{(\alpha - \beta) \pm \sqrt{(\alpha - \beta)^2 + 4\alpha \beta (1 - (x_j^Ty_j)^2)}}{2}. $$ We have $$ \sqrt{(\alpha - \beta)^2 + 4\alpha \beta (1 - (x_j^Ty_j)^2)} = \sqrt{(\alpha + \beta)^2 - 4 \alpha \beta (x_j^Ty_j)^2} \leq \sqrt{(\alpha + \beta)^2} = \alpha + \beta, $$ so both eigenvalues of $P_j$ lie in $[-\beta, \alpha]$. Since $\alpha_j x_jx_j^T - \beta_j y_jy_j^T$ is symmetric, its largest singular value is its largest absolute eigenvalue, and hence is at most $\max\{\alpha_j,\beta_j\}$.
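A small numerical check of this rank-two estimate, $\|\alpha\, xx^T - \beta\, yy^T\| \leq \max\{\alpha, \beta\}$ for unit vectors $x, y$ and $\alpha, \beta \geq 0$ (a sketch assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)

for _ in range(500):
    n = int(rng.integers(2, 6))
    x = rng.standard_normal(n); x /= np.linalg.norm(x)  # unit vector x
    y = rng.standard_normal(n); y /= np.linalg.norm(y)  # unit vector y
    a, b = rng.uniform(0, 10, size=2)                   # alpha, beta >= 0
    M = a * np.outer(x, x) - b * np.outer(y, y)
    # spectral norm of the rank-<=2 difference is at most max(alpha, beta)
    assert np.linalg.norm(M, 2) <= max(a, b) + 1e-9
```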