Rigorous proof of Taylor expansion for matrix square root


Suppose that $A$ and $B$ are $n\times n$ positive definite diagonal matrices and $B$ is non-random. It is given that $A^2-B^2=O_p(n^{-1/2})$. Why is the following true?
$$ A=B+\frac{1}{2}B^{-1}(A^2-B^2)+o_p(n^{-1/2}).\tag{$*$} $$
I can see why ($*$) is true heuristically:
$$ A^2=B^2+(A^2-B^2)=B^2[I+B^{-2}(A^2-B^2)]. $$
Then:
\begin{align*} A{\color{red}=}\sqrt{A^2}&=B\sqrt{I+B^{-2}(A^2-B^2)}\\ &{\color{red}=}B\left[I+\frac{1}{2}B^{-2}(A^2-B^2)-\frac{1}{8}\Big(B^{-2}(A^2-B^2)\Big)^2+\cdots\right]\\ &=B+\frac{1}{2}B^{-1}(A^2-B^2)-\frac{1}{8}B\Big(B^{-2}(A^2-B^2)\Big)^2+\cdots\\ &{\color{red}=}B+\frac{1}{2}B^{-1}(A^2-B^2)+o_p(n^{-1/2}). \end{align*}
I'm unsure about the equalities colored red above.
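Since all the matrices are diagonal, one way to make this rigorous without invoking the full matrix series is to argue entrywise. A sketch with the Lagrange remainder, assuming the diagonal entries $b_i$ of $B$ are bounded away from zero:

```latex
% Entrywise Taylor expansion with Lagrange remainder (a sketch).
% Write a_i^2 = b_i^2 + \delta_i, where \delta_i = (A^2-B^2)_{ii} = O_p(n^{-1/2}).
% For f(x) = \sqrt{x}: f'(x) = \tfrac{1}{2}x^{-1/2}, f''(x) = -\tfrac{1}{4}x^{-3/2}.
\[
  a_i = \sqrt{b_i^2 + \delta_i}
      = b_i + \frac{\delta_i}{2 b_i} - \frac{\delta_i^2}{8\,\xi_i^{3/2}},
  \qquad \xi_i \text{ between } b_i^2 \text{ and } b_i^2 + \delta_i.
\]
% Since \delta_i = O_p(n^{-1/2}) and \xi_i is bounded away from zero with
% probability tending to one, the remainder is O_p(n^{-1}) = o_p(n^{-1/2}),
% which gives (*) entry by entry.
```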

1 Answer

Since $A$ and $B$ are diagonal, they commute, so everything reduces to the scalar expansion $\sqrt{1+x}=1+\tfrac{x}{2}+O(x^2)$ applied entrywise; the series manipulations in red are valid on the event that the entries of $B^{-2}(A^2-B^2)$ are less than $1$ in absolute value, which holds with probability tending to one because $A^2-B^2=O_p(n^{-1/2})$. The residual term is $O_p(1/n)$, hence $o_p(n^{-1/2})$, provided $B$ is a constant matrix with eigenvalues bounded away from zero.
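The $O_p(1/n)$ rate of the remainder can be checked numerically. A minimal sketch (my own illustration, not from the question): since everything is diagonal, we work with the diagonal entries as vectors, draw a perturbation $A^2-B^2$ of size $n^{-1/2}$, and verify that the error of the first-order approximation $B+\tfrac12 B^{-1}(A^2-B^2)$ scales like $1/n$.

```python
import numpy as np

# Numerical check that A - [B + (1/2) B^{-1}(A^2 - B^2)] = O(1/n)
# when A, B are positive definite diagonal and A^2 - B^2 = O(n^{-1/2}).
rng = np.random.default_rng(0)

for n in [100, 10_000, 1_000_000]:
    b = rng.uniform(1.0, 2.0, size=5)                    # diag of B, bounded away from 0
    delta = rng.uniform(-1.0, 1.0, size=5) / np.sqrt(n)  # diag of A^2 - B^2, size n^{-1/2}
    a = np.sqrt(b**2 + delta)                            # diag of A = sqrt(A^2)
    approx = b + 0.5 * delta / b                         # B + (1/2) B^{-1}(A^2 - B^2)
    err = np.max(np.abs(a - approx))
    print(f"n = {n:>9}: max error = {err:.2e}, n * error = {n * err:.4f}")
```

The printed product `n * error` stays bounded as `n` grows, consistent with the claimed $O_p(1/n)$ remainder.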