Spectral Radius of A+B

I am solving a physical problem numerically; the discretization yields three real, symmetric, positive semi-definite matrices $A$, $A_1$, and $A_2$, where $A = A_1 + A_2$.

I know that the following inequalities hold: \begin{equation} \|A\|_2 \leq \|A_1\|_2 + \|A_2\|_2, \end{equation} \begin{equation} \rho(A) \leq \rho(A_1) + \rho(A_2). \end{equation} However, in extensive numerical studies I have consistently observed the stronger bound $\rho(A) \leq \max\{ \rho(A_1) , \rho(A_2)\}$ across different parameters and discretizations of the problem (the analogous bound has NOT held for the norms). I am trying to find a mathematical justification for this. Any hint would be helpful.
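For context, the sum bound (and the failure of the max bound for generic matrices) is easy to check numerically. A minimal sketch with NumPy; the helper `random_psd` and the fixed seed are my own illustration, not part of the original problem:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n):
    """Random symmetric positive semi-definite matrix (M M^T is always PSD)."""
    M = rng.standard_normal((n, n))
    return M @ M.T

A1, A2 = random_psd(5), random_psd(5)
A = A1 + A2

# For a symmetric matrix the spectral radius is the largest |eigenvalue|.
rho = lambda M: np.abs(np.linalg.eigvalsh(M)).max()

print(rho(A) <= rho(A1) + rho(A2))      # True: Weyl's inequality for lambda_max
print(rho(A) <= max(rho(A1), rho(A2)))  # False for generic full-rank PSD pairs
```

For generic (full-rank) PSD matrices the max bound fails strictly, since $x^\top A_2 x > 0$ at the top eigenvector of $A_1$, so whatever makes it hold in the simulations must be a special structural property of $A_1$ and $A_2$.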

Thanks,

Edit: I know this is not true in general. However, there should be certain conditions (imposed on the matrices by the simulation) under which it holds; I am looking for those conditions.

1 Answer

If you add the condition that the eigenspaces of $A_1$ and $A_2$ are orthogonal, you will have better luck proving these sorts of bounds for the Schatten norms.

If you just want to prove it for the spectral norm, you should only need the eigenspaces of the extreme eigenvalues to be orthogonal, rather than all of them.
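The orthogonality condition can be checked numerically. A minimal sketch, assuming the ranges of $A_1$ and $A_2$ are spanned by complementary columns of a random orthogonal matrix (this construction, and the names used, are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 3

# Random orthogonal matrix: the first k columns span range(A1),
# the remaining n-k columns span range(A2), so the ranges are orthogonal.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
U, V = Q[:, :k], Q[:, k:]

A1 = U @ np.diag(rng.uniform(0.1, 2.0, k)) @ U.T      # PSD, rank k
A2 = V @ np.diag(rng.uniform(0.1, 2.0, n - k)) @ V.T  # PSD, rank n-k
A = A1 + A2

rho = lambda M: np.abs(np.linalg.eigvalsh(M)).max()

# With orthogonal ranges, A1 @ A2 = 0 and the spectrum of A is the union
# of the two spectra, so the conjectured bound holds (with equality):
print(np.isclose(rho(A), max(rho(A1), rho(A2))))
```

Intuitively, orthogonal eigenspaces mean $A_1$ and $A_2$ act on independent subspaces, so adding them cannot combine their contributions along any single direction.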