Factoring a matrix out of linear matrix equation


I'm having a bit of trouble following a solution in a textbook, one step in particular.

I have the expression $(Z + tV)^{-1}$, where $Z$ and $V$ are matrices and $t$ is a scalar. $Z$ is positive definite and $V$ is symmetric. The next step in the solution implies that

$$ (Z + tV)^{-1} = Z^{-1}(I + tZ^{-1/2}VZ^{-1/2})^{-1} $$

Now I can understand factoring out $Z$ first, i.e. $(Z + tV) = Z(I + tZ^{-1/2}VZ^{-1/2})$, but then I thought $(AB)^{-1} = B^{-1}A^{-1}$, i.e. the order switches (assuming both $A$ and $B$ are invertible). Maybe $Z$ is instead factored out on the right first, and then that property is applied?
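As a quick numerical sanity check of the two readings, here is a small NumPy sketch (random $Z$, $V$, $t$; all names are illustrative). It compares the line as written, $Z^{-1}(I + tZ^{-1/2}VZ^{-1/2})^{-1}$, against the symmetric factorization $Z^{-1/2}(I + tZ^{-1/2}VZ^{-1/2})^{-1}Z^{-1/2}$, which is what textbooks usually mean here, since $(Z + tV) = Z^{1/2}(I + tZ^{-1/2}VZ^{-1/2})Z^{1/2}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random positive-definite Z and symmetric V, scalar t
A = rng.standard_normal((4, 4))
Z = A @ A.T + 4 * np.eye(4)   # positive definite by construction
B = rng.standard_normal((4, 4))
V = (B + B.T) / 2             # symmetric
t = 0.3

# Z^{-1/2} via the eigendecomposition of Z (valid since Z is SPD)
w, Q = np.linalg.eigh(Z)
Z_inv_half = Q @ np.diag(w ** -0.5) @ Q.T

lhs = np.linalg.inv(Z + t * V)

# Middle factor common to both candidates
M = np.eye(4) + t * Z_inv_half @ V @ Z_inv_half

# Candidate 1: the identity exactly as quoted, Z^{-1} M^{-1}
cand_question = np.linalg.inv(Z) @ np.linalg.inv(M)

# Candidate 2: symmetric factorization, Z^{-1/2} M^{-1} Z^{-1/2}
cand_symmetric = Z_inv_half @ np.linalg.inv(M) @ Z_inv_half

print(np.allclose(lhs, cand_question))   # generally False
print(np.allclose(lhs, cand_symmetric))  # True
```

This suggests the quoted line may simply be a typo for the symmetric version, in which case no reordering trick is needed at all: invert the three factors $Z^{1/2}$, $(I + tZ^{-1/2}VZ^{-1/2})$, $Z^{1/2}$ in reverse order.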

Thanks for any help; I'm clearly not too hot on linear algebra.