Let $A$ and $B$ be positive semidefinite matrices of the same size, and suppose that $A A' \le B B'$. Let $C > 0$ be a positive definite matrix. Does this imply that $A C A' \le B C B'$?
Matrix inequality with square products
263 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 best solutions below.
Edit:
My discussion with Bertrand allowed me to find several mistakes in my reasoning and to formalize some of the results I arrived at. More specifically, I was able to show (the proof is not complete) that the claim holds under the stronger assumption $\sigma_{\max}(A^2)\sigma_{\max}(C) \leq \sigma_{\min}(B^2)\sigma_{\min}(C)$. This can be rewritten equivalently through the condition number of $C$: $\kappa(C)\leq \frac{\sigma_{\min}(B^2)}{\sigma_{\max}(A^2)}$. Note that this stronger assumption implies $\sigma_{\max}(A^2)\leq \sigma_{\min}(B^2)$, since $C$ is PD and hence $0<\sigma_{\min}(C)\leq \sigma_{\max}(C)$. This in turn implies $AA'\leq BB'$, which is easy to see (I can prove the claim below in more detail if anyone is interested; it follows from a result in convex analysis):
$$\forall x : |x|=1 \rightarrow x'AA'x \leq \max_{u:|u|=1}{u'AA'u} = \sigma_{\max}(A^2) \leq \sigma_{\min}(B^2) = \min_{u:|u|=1}{u'BB'u} \leq x'BB'x$$
(Note that the restriction to $|x|=1$ is without loss of generality: dividing both sides of the original inequality by $|x|^2$ reduces it to this case.)
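This step can be sanity-checked numerically. Below is a small NumPy sketch (the construction of $A$ and $B$ is my own, chosen for illustration so that $\sigma_{\max}(A^2) \leq \sigma_{\min}(B^2)$ holds):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
W = rng.standard_normal((n, n))
A = W @ W.T                          # random symmetric PSD matrix
A /= np.linalg.eigvalsh(A).max()     # rescale so that sigma_max(A^2) = 1
V = rng.standard_normal((n, n))
B = np.eye(n) + V @ V.T              # eigenvalues >= 1, so sigma_min(B^2) >= 1
# AA' <= BB' iff every eigenvalue of BB' - AA' is nonnegative
diff = B @ B.T - A @ A.T
assert np.linalg.eigvalsh(diff).min() >= -1e-10
print("AA' <= BB' holds")
```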
Finally we have:
$$x'ACA'x \leq \max_{u:|u|=1}{u'ACA'u} \leq \sigma_{\max}(A^2)\sigma_{\max}(C) \\ \leq \sigma_{\min}(B^2)\sigma_{\min}(C) \leq \min_{u:|u|=1}{u'BCB'u} \leq x'BCB'x$$
Note that since $A,B,C$ are PSD, the following eigendecompositions are valid: $$A=QKQ', \quad B=PLP', \quad C=R\Lambda R'$$ where $Q,P,R$ are orthogonal matrices (made up of the eigenvectors) and $K,L,\Lambda$ are diagonal matrices (made up of the corresponding eigenvalues). The inequality $\max_{u:|u|=1}{u'ACA'u} \leq \sigma_{\max}(A^2)\sigma_{\max}(C)$ holds because:
$$\max_{u:|u|=1}{u'ACA'u} \leq \max_{u,Q,R:|u|=1, QQ'=I, RR'=I}{u'ACA'u} = \sigma_{\max}(A^2)\sigma_{\max}(C)$$
The last step holds because the matrix $ACA'$ cannot scale a unit vector by more than $\sigma_{\max}(A^2)\sigma_{\max}(C)$. Unfortunately I have not proved this formally; my hand-wavy 'proof' relies on the decomposition:
$$u'ACA'u = (R'QKQ'u)'\Lambda(R'QKQ'u)$$
It goes something like this: since we take a maximum, we are free to choose $u$, so pick $u=(1,0,\ldots,0)$. Since the maximum also ranges over the rotation $Q$, we can align $u$ with the eigenvector corresponding to the square root of the maximum eigenvalue of $A^2$, so that multiplying by $K$ scales it by $\sqrt{\sigma_{\max}(A^2)}$. Then, by choosing $R$ appropriately (again due to the maximum), we can align the result with the eigenvector of $C$ corresponding to its maximum eigenvalue, scaling it by a further $\sigma_{\max}(C)$. This happens on both sides, so after the dot product one gets the squared length (only one of the two sides gets multiplied by $\sigma_{\max}(C)$, though). My argument for $u'BCB'u$ is similar.
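The bound $\max_{u:|u|=1}{u'ACA'u} \leq \sigma_{\max}(A^2)\sigma_{\max}(C)$ can also be seen as submultiplicativity of the spectral norm, $\|ACA'\|_2 \le \|A\|_2\|C\|_2\|A'\|_2$, and it is easy to test numerically (a NumPy sketch over random matrices):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
for _ in range(200):
    W = rng.standard_normal((n, n))
    V = rng.standard_normal((n, n))
    A = W @ W.T                      # random symmetric PSD matrix
    C = V @ V.T + 0.1 * np.eye(n)    # random PD matrix
    # max over |u|=1 of u' A C A' u is the top eigenvalue of the PSD matrix A C A'
    lhs = np.linalg.eigvalsh(A @ C @ A.T).max()
    rhs = np.linalg.eigvalsh(A @ A.T).max() * np.linalg.eigvalsh(C).max()
    assert lhs <= rhs * (1 + 1e-10)
print("bound held in all 200 trials")
```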
If one can show that $AA'\leq BB'$ does not imply the stronger assumption (which I believe is the case), and also exhibit specific $A,B,C$ for which the inequalities become equalities, then it would follow that $A,B$ PSD, $C$ PD, and $AA'\leq BB'$ are not enough for the claim.
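The full chain of inequalities under the stronger assumption can be checked numerically. Below is a NumPy sketch; the way $A$, $B$, $C$ are generated is my own construction, chosen only so that $\sigma_{\max}(A^2)\sigma_{\max}(C) \leq \sigma_{\min}(B^2)\sigma_{\min}(C)$ is satisfied:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def rand_psd(n, rng):
    # random symmetric positive semidefinite matrix
    W = rng.standard_normal((n, n))
    return W @ W.T

for _ in range(100):
    A = rand_psd(n, rng)
    A /= np.linalg.eigvalsh(A).max()          # now sigma_max(A^2) = 1
    C = rand_psd(n, rng) + 1e-2 * np.eye(n)   # positive definite
    evC = np.linalg.eigvalsh(C)
    kappa = evC.max() / evC.min()             # condition number of C
    # pick B with sigma_min(B^2) >= kappa(C) = sigma_max(A^2) * kappa(C)
    B = np.sqrt(kappa) * (np.eye(n) + rand_psd(n, rng))
    # the claim: A C A' <= B C B', i.e. B C B' - A C A' is PSD
    diff = B @ C @ B.T - A @ C @ A.T
    assert np.linalg.eigvalsh(diff).min() >= -1e-6
print("claim held in all 100 trials")
```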
The answer is clearly no. Just consider the special case where $B$ is positive definite. The condition $A^2\le B^2$ is then equivalent to $\|B^{-1}A\|_2\le1$, while the condition $ACA\le BCB$ is equivalent to $\|C^{-1/2}(B^{-1}A)C^{1/2}\|_2\le1$. Since a matrix norm can be increased or decreased by a similarity transform, neither of the two conditions implies the other in general. E.g. when $$ A=\frac12\pmatrix{1&1\\ 1&1},\ B=I,\ C=\pmatrix{1\\ &4}, $$ we have $B^2-A^2=\frac12\pmatrix{1&-1\\ -1&1}\ge0$ but $BCB-ACA=\frac14\pmatrix{-1&-5\\ -5&11}$ is indefinite. On the other hand, if we define $A_1=C^{1/2}AC^{1/2},\ B_1=C^{1/2}BC^{1/2}$ and $C_1=C^{-1}$ in the above, then $B_1C_1B_1-A_1C_1A_1\ge0$ but $B_1^2-A_1^2$ is indefinite.
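Both directions of this counterexample are easy to verify numerically (a NumPy check using the matrices above):

```python
import numpy as np

A = 0.5 * np.array([[1., 1.], [1., 1.]])
B = np.eye(2)
C = np.diag([1., 4.])

# B^2 - A^2 is PSD ...
assert np.linalg.eigvalsh(B @ B - A @ A).min() >= -1e-12
# ... yet B C B - A C A is indefinite
d = np.linalg.eigvalsh(B @ C @ B - A @ C @ A)
assert d.min() < 0 < d.max()

# the reverse construction: B1 C1 B1 - A1 C1 A1 is PSD, but B1^2 - A1^2 is indefinite
Cs = np.diag([1., 2.])                                # C^{1/2}
A1, B1, C1 = Cs @ A @ Cs, Cs @ B @ Cs, np.diag([1., 0.25])
assert np.linalg.eigvalsh(B1 @ C1 @ B1 - A1 @ C1 @ A1).min() >= -1e-12
e = np.linalg.eigvalsh(B1 @ B1 - A1 @ A1)
assert e.min() < 0 < e.max()
print("both directions confirmed")
```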