For certain positive semidefinite matrices, subtracting the outer product of their row-sums does not change the positive semidefiniteness


Let $e$ denote the vector of all ones, $J=ee^T$ and $\langle A,B\rangle = trace(AB^T)$.
Consider a symmetric positive semidefinite (psd) matrix that is also entrywise nonnegative, $A\geq 0$ (that is, $a_{ij} \geq 0$ for all $i,j$), with $\langle A, J\rangle =1$. The row sums of $A$ are then given by $Ae$.
I wonder whether the matrix $$\begin{bmatrix} 1& e^TA\\Ae &A\end{bmatrix}$$ is always psd, because then I could simply use $A$ in my semidefinite-programming problem. By the Schur complement, this is equivalent to asking whether $$A-Aee^TA = A-AJA$$ is always psd. Initially I thought this is not the case, but I couldn't construct a counterexample, so by now I believe it should hold.
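As a quick sanity check (not part of the original question), one can sample random entrywise-nonnegative psd matrices with $\langle A, J\rangle = 1$ and test numerically whether $A - AJA$ ever has a negative eigenvalue; the helper name `random_A` is just illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_A(d):
    """Random entrywise-nonnegative psd matrix with <A, J> = 1."""
    B = rng.random((d, d))      # entries in [0, 1)
    A = B @ B.T                 # symmetric psd, entrywise nonnegative
    return A / A.sum()          # normalize so e^T A e = 1

for _ in range(1000):
    d = int(rng.integers(2, 6))
    A = random_A(d)
    e = np.ones(d)
    M = A - np.outer(A @ e, A @ e)   # A - A e e^T A
    assert np.linalg.eigvalsh(M).min() > -1e-10
print("no counterexample found in 1000 trials")
```

Random search is of course no proof, but it supports the conjecture that no counterexample exists.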

Does anyone have an idea how to tackle this, or is there already a related result? I spent several hours on it yesterday; hopefully I just overlooked something straightforward.

I'm also fine with a counterexample, of course.


EDIT: So this is my current approach for the special case of invertible $A$:

Multiplying by $A^{-1}$ on both sides (that is, forming $A^{-1}(A-AJA)A^{-1}$), we see that it suffices to show that $$ A^{-1}-J$$ is psd. We can check this via the nonnegativity of the associated quadratic form.

For arbitrary $x$, we have a decomposition $x=v + \mu e$, where $\langle v,e\rangle =0$. Then $$x^T (A^{-1}-J) x =[ v^TA^{-1}v + 2\mu\, v^TA^{-1}e]+ \mu^2[ e^T A^{-1}e - d^2], $$ where $d$ is the dimension of $A$. The term $v^TA^{-1}v$ is nonnegative because $A^{-1}$ is psd as well, although the cross term $2\mu\, v^TA^{-1}e$ would still need to be controlled.

So the question actually boils down to 2 steps:

  1. When $A\geq 0$, $A$ psd, $\langle A, J\rangle =1$, is it always true that $\langle A^{-1}, J\rangle \geq d^2$?

  2. Does the case of arbitrary rank of $A$ follow from the full-rank case?
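Step 1 can at least be probed numerically (a sketch using NumPy; the construction of $A$ below is just one arbitrary way to generate invertible test matrices):

```python
import numpy as np

rng = np.random.default_rng(1)

for _ in range(500):
    d = int(rng.integers(2, 6))
    B = rng.random((d, d)) + 0.1 * np.eye(d)  # nonnegative, (a.s.) full rank
    A = B @ B.T                               # symmetric psd, entrywise nonnegative
    A /= A.sum()                              # normalize so <A, J> = 1
    e = np.ones(d)
    # <A^{-1}, J> = e^T A^{-1} e, computed via a linear solve
    assert e @ np.linalg.solve(A, e) >= d**2 - 1e-8
print("e^T A^{-1} e >= d^2 held in all 500 trials")
```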

There are 2 answers below.

BEST ANSWER

It is true. Let $R$ be the symmetric square root of $A$, which exists for positive semidefinite matrices. That is, $R$ is symmetric, positive semidefinite, and $R^2=A$.

For $x\in\mathbb R^n$ we have $$ x^T A x - x^T A e e^T A x \ge 0 $$ if and only if $$ \|Rx\|^2 \underbrace{\|Re\|^2}_{\langle A,J \rangle=1} = x^T A x \ge x^T A e e^T A x = ((Rx)^T (Re))^2, $$ which is true by Cauchy-Schwarz.
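This chain of (in)equalities can be verified numerically for a random instance (a sketch; the symmetric root $R$ is computed here via an eigendecomposition):

```python
import numpy as np

rng = np.random.default_rng(2)

d = 5
B = rng.random((d, d))
A = B @ B.T
A /= A.sum()                    # <A, J> = 1
e = np.ones(d)

# Symmetric psd square root R with R^2 = A, via an eigendecomposition.
w, V = np.linalg.eigh(A)
R = V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

x = rng.standard_normal(d)
assert abs((R @ e) @ (R @ e) - 1.0) < 1e-10   # ||Re||^2 = <A, J> = 1
lhs = x @ A @ x                               # = ||Rx||^2 ||Re||^2
rhs = (x @ A @ e) ** 2                        # = ((Rx)^T (Re))^2
assert lhs >= rhs - 1e-12                     # Cauchy-Schwarz
```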

ANSWER

Part 1 is covered by the question *A theorem of symmetric positive definite matrix*.

Part 2 should follow by considering $A+\epsilon I$ (renormalized so that $\langle A+\epsilon I, J\rangle = 1$) and letting $\epsilon \to 0$: eigenvalues depend continuously on the matrix entries, so psd-ness survives the limit.
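A small numerical illustration of this limiting argument (a sketch; the rank-1 matrix is one convenient singular example, and each perturbed matrix is renormalized so that $\langle \cdot, J\rangle = 1$):

```python
import numpy as np

rng = np.random.default_rng(3)

d = 4
b = rng.random(d)
A = np.outer(b, b)     # rank 1, hence singular for d > 1
A /= A.sum()           # <A, J> = 1
J = np.ones((d, d))

for eps in [1e-1, 1e-3, 1e-6, 0.0]:
    Ae = A + eps * np.eye(d)
    Ae /= Ae.sum()                   # renormalize so <Ae, J> = 1
    M = Ae - Ae @ J @ Ae             # Ae - Ae e e^T Ae
    assert np.linalg.eigvalsh(M).min() > -1e-10
```

Each perturbed (invertible) matrix satisfies the hypotheses, and the minimum eigenvalue of $A_\epsilon - A_\epsilon J A_\epsilon$ stays nonnegative all the way down to $\epsilon = 0$.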