Why is the product of a covariance matrix with the inverse of its sum with the identity matrix symmetric?


I have an empirical observation (it holds in every simple simulation I run, e.g. in R) which I cannot prove to myself:

Let $A$ be an $n \times n$ covariance matrix (i.e. it is symmetric PSD), let $I_n$ be the identity matrix, and let $\theta_1$ and $\theta_2$ be scalars (in my case they are always positive, but that does not matter). Let:

$V = (\theta_1 A + \theta_2I_n)^{-1}A$

It seems that $V$ is always symmetric! Can we prove it?

E.g. in R:

A <- cov(rbind(c(1,2.1,3), c(3,4,5.3), c(3,4.2,0)))
isSymmetric(solve(2 * A + 3 * diag(3)) %*% A)
[1] TRUE
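The same sanity check can be run in Python/NumPy (an assumption on my part; the question uses R). It also verifies that the eigenvalues of $V$ are $\lambda_i/(\theta_1\lambda_i+\theta_2)$ for the eigenvalues $\lambda_i$ of $A$, which is what the spectral decomposition of $A$ predicts:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 4))
A = np.cov(X, rowvar=False)          # 4x4 symmetric PSD covariance matrix
theta1, theta2 = 2.0, 3.0

M = theta1 * A + theta2 * np.eye(4)
V = np.linalg.solve(M, A)            # V = (theta1*A + theta2*I)^{-1} A

# V is symmetric up to floating-point error
assert np.allclose(V, V.T)

# Eigenvalues of V are lam/(theta1*lam + theta2) for eigenvalues lam of A
lam = np.linalg.eigvalsh(A)
assert np.allclose(np.sort(np.linalg.eigvalsh(V)),
                   np.sort(lam / (theta1 * lam + theta2)))
```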

To anyone interested: this is important to me mainly because it means I have two symmetric matrices $A, B$ whose product $AB$ is symmetric, in which case its eigenvalues are products of the eigenvalues of $A$ and $B$ according to this, which also simplifies its trace.
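For completeness, the spectral decomposition makes both the symmetry and the trace simplification explicit (a sketch, using only the standard eigendecomposition of a symmetric matrix): write $A = Q\Lambda Q^\top$ with $Q$ orthogonal and $\Lambda = \mathrm{diag}(\lambda_1,\dots,\lambda_n)$. Then

$$V = (\theta_1 A + \theta_2 I_n)^{-1}A = Q(\theta_1\Lambda + \theta_2 I_n)^{-1}\Lambda Q^\top,$$

so $V$ is symmetric with eigenvalues $\lambda_i/(\theta_1\lambda_i+\theta_2)$, and $\operatorname{tr}(V)=\sum_{i=1}^n \lambda_i/(\theta_1\lambda_i+\theta_2)$.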

2 Answers

Best Answer

$$\begin{aligned}
V = (\theta_1 A + \theta_2 I_n)^{-1}A &\implies (\theta_1 A + \theta_2 I_n)V = A\\
&\implies V^*(\theta_1 A + \theta_2 I_n)^* = A^*\\
&\implies V^*(\theta_1 A + \theta_2 I_n) = A\\
&\implies V^*(\theta_1 A + \theta_2 I_n)A^{-1} = I_n\\
&\implies V^*(\theta_1 I_n + \theta_2 A^{-1}) = I_n\\
&\implies V^*A^{-1}(\theta_1 A + \theta_2 I_n) = I_n\\
&\implies V^*A^{-1} = (\theta_1 A + \theta_2 I_n)^{-1}\\
&\implies V^* = (\theta_1 A + \theta_2 I_n)^{-1}A = V
\end{aligned}$$

(assuming $A$ is invertible, so that we may multiply by $A^{-1}$).

Notice: it is much shorter if you use that transposition and inversion commute, i.e. $(X^{-1})^* = (X^*)^{-1}$.
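Spelled out, that shorter route (which also avoids assuming $A$ is invertible) uses $(X^{-1})^* = (X^*)^{-1}$ together with the fact that $A$ commutes with $\theta_1 A + \theta_2 I_n$, hence with its inverse:

$$V^* = A^*\bigl((\theta_1 A + \theta_2 I_n)^{-1}\bigr)^* = A\,(\theta_1 A + \theta_2 I_n)^{-1} = (\theta_1 A + \theta_2 I_n)^{-1}A = V.$$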


To state a general theorem: given two polynomials $p(x,y), q(x,y)$ (or indeed any functions you know how to apply to matrices), $p(A,A^{-1})$ and $q(A,A^{-1})$ commute for any invertible $A$, and they are symmetric (Hermitian in the complex case) whenever $A$ is.

Another answer

If $$XY=YX$$ then, multiplying on the left and on the right by $X^{-1}$: $$YX^{-1}=X^{-1}Y.$$ Thus, since $[\theta_1 A+\theta_2 I_n,\,A]=0$, also $[(\theta_1 A+\theta_2 I_n)^{-1},\,A]=0$, so $V$ is a product of two commuting symmetric matrices and hence symmetric (all this assuming the inverses exist, etc.).
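The last step written out: set $B=(\theta_1 A+\theta_2 I_n)^{-1}$, which is symmetric because the inverse of a symmetric matrix is symmetric. Commutation then gives

$$(BA)^\top = A^\top B^\top = AB = BA,$$

so $V = BA$ is symmetric.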