Showing $A-\alpha Acc^T A$ is positive semi-definite


I'm trying to show that for a positive semi-definite (PSD) matrix $A$, an arbitrary vector $c$, and a positive scalar $r$, the matrix

\begin{align} f(A)\triangleq A-\frac{1}{c^T A c + r}Acc^TA \end{align}

is also PSD. I believe this statement is true for two reasons. First, it is reminiscent of the update step of the Kalman filter, where $A$ is the a priori error covariance matrix and $f(A)$ is the a posteriori error covariance matrix, which should be PSD. Second, I evaluated this equation numerically tens of millions of times with random choices of all variables, and $f(A)$ was PSD every time.
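For reference, a minimal sketch of that numerical experiment (the function name `f` and the random-sampling choices here are my own, not from any particular library): draw a random PSD $A$, a random $c$, and a positive $r$, then check that the smallest eigenvalue of $f(A)$ is non-negative up to floating-point tolerance.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(A, c, r):
    # f(A) = A - (1 / (c^T A c + r)) * A c c^T A
    Ac = A @ c
    return A - np.outer(Ac, Ac) / (c @ A @ c + r)

for _ in range(1000):
    n = int(rng.integers(2, 6))
    M = rng.standard_normal((n, n))
    A = M @ M.T                          # PSD by construction
    c = rng.standard_normal(n)
    r = float(rng.uniform(0.1, 10.0))
    lam_min = np.linalg.eigvalsh(f(A, c, r)).min()
    assert lam_min > -1e-8, lam_min      # PSD up to round-off
```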

Here are some of my attempts at the proof:

  1. When $A$ is a scalar, we can multiply $f(A)$ by $c^2 A + r > 0$ to obtain

\begin{align} (c^TAc+r)f(A) &= c^2 A^2 + rA - c^2 A^2 \\ &= rA \\ &\geq 0. \end{align}

  2. When $A$ is a matrix, we need to show that $f(A)$ has only non-negative eigenvalues. So I tried finding a simultaneous diagonalization of $A$ and $\frac{1}{c^T A c + r}Acc^TA$, but failed because these two matrices do not commute in general.

  3. I tried diagonalizing $A$ as $O^T D O$ so that I could write

\begin{align} f(A) = O^T \left( D - \frac{1}{c^T A c + r} DOcc^T O^T D\right) O \end{align}

but I got nowhere.

Any hints on writing down the proof?


On BEST ANSWER

Hit the matrix on both sides with an arbitrary vector $v$:

\begin{align*} v^T f(A) v &= \langle v, v\rangle - \frac{\langle c, v\rangle^2}{\langle c, c\rangle + r}\\ &= \frac{r\langle v,v\rangle + \langle v,v\rangle\langle c,c\rangle - \langle c,v\rangle^2}{\langle c, c\rangle + r}\\ &\geq \frac{r\langle v,v\rangle}{\langle c, c\rangle + r}\\ &\geq 0, \end{align*} where $\langle u,v\rangle = u^TAv$ is the inner product induced by $A$ (a semi-inner product when $A$ is merely PSD, for which Cauchy–Schwarz still holds) and the key middle inequality is by Cauchy–Schwarz.
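A quick numerical sanity check of this chain of (in)equalities (a sketch with my own variable names, not part of the answer): with $\langle u, v\rangle = u^T A v$, the quadratic form $v^T f(A) v$ should equal $\langle v,v\rangle - \langle c,v\rangle^2/(\langle c,c\rangle + r)$ exactly, and Cauchy–Schwarz should give the lower bound $r\langle v,v\rangle/(\langle c,c\rangle + r)$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T                              # PSD by construction
c, v = rng.standard_normal(n), rng.standard_normal(n)
r = 0.5

ip = lambda u, w: u @ A @ w              # (semi-)inner product induced by A
fA = A - np.outer(A @ c, A @ c) / (ip(c, c) + r)

lhs = v @ fA @ v
# First equality in the derivation:
assert np.isclose(lhs, ip(v, v) - ip(c, v) ** 2 / (ip(c, c) + r))
# Cauchy-Schwarz lower bound:
assert lhs >= r * ip(v, v) / (ip(c, c) + r) - 1e-12
```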