Computing the derivative of square root of a matrix


Maybe this is a silly question, but I could not figure out how to solve it. Let $X = M_n(\mathbb{R})$ be the space of $n \times n$ matrices over the reals. Then there exist two open neighborhoods of the identity, $U$ and $V$, such that the function $\phi \colon V \longrightarrow U$, $\phi(A) = \sqrt{A}$, is well defined and differentiable at the identity $I$. Furthermore, what is $d\phi(I)(T)$? I was thinking of inverting the matrix $A = I - B$ via the usual series $\sum_i B^{i}$ and then, somehow, finding the unique square root.
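For what it's worth, the power-series idea does work near the identity: the binomial series $\sqrt{I+B} = \sum_{k \ge 0} \binom{1/2}{k} B^k$ converges whenever the spectral radius of $B$ is below $1$. A small numerical sketch (the function name and term count are my own choices):

```python
import numpy as np

def sqrt_series(B, terms=30):
    # Truncated binomial series sqrt(I + B) = sum_k binom(1/2, k) B^k,
    # valid when the spectral radius of B is below 1.
    n = B.shape[0]
    S = np.eye(n)
    coeff = 1.0
    P = np.eye(n)  # running power B^k
    for k in range(1, terms):
        coeff *= (0.5 - (k - 1)) / k  # binom(1/2, k), computed recursively
        P = P @ B
        S = S + coeff * P
    return S

rng = np.random.default_rng(2)
B = 0.1 * rng.standard_normal((4, 4))  # small enough for convergence
S = sqrt_series(B)
print(np.allclose(S @ S, np.eye(4) + B))  # S squares back to I + B
```

This only answers existence of the square root near $I$, not differentiability; the accepted answer below handles both at once.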

Thanks in advance.

Best answer:

This looks like a straightforward application of the inverse function theorem. Consider $$\psi \colon M_n(\mathbb{R}) \to M_n(\mathbb{R}); \quad A \mapsto A^2.$$

Then $\psi$ is continuously differentiable everywhere with derivative $d\psi(A)(T) = AT+TA$: indeed, $\psi(A+T) = A^2 + AT + TA + T^2$, and $AT+TA$ is the part linear in $T$. At $A = I$, the derivative is $T \mapsto 2T$, which is invertible.
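A quick finite-difference check of the formula $d\psi(A)(T) = AT + TA$ at a random matrix (step size and dimension are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
T = rng.standard_normal((n, n))

def psi(M):
    return M @ M

# Directional derivative of psi at A in direction T, by finite differences:
# (psi(A + h*T) - psi(A)) / h = A@T + T@A + h*T@T, so the error is O(h).
h = 1e-6
fd = (psi(A + h * T) - psi(A)) / h
analytic = A @ T + T @ A
print(np.allclose(fd, analytic, atol=1e-4))  # True
```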

Now conclude by the inverse function theorem (statement taken from Wikipedia):

If the total derivative of a continuously differentiable function $F$ defined from an open set $U$ of $\mathbb{R}^n$ into $\mathbb{R}^n$ is invertible at a point $p$ (i.e., the Jacobian determinant of $F$ at $p$ is non-zero), then $F$ is an invertible function near $p$. That is, an inverse function to $F$ exists in some neighborhood of $F(p)$. Moreover, the inverse function $F^{-1}$ is also continuously differentiable.

Further, $$J_{F^{-1}}(F(p)) = [J_F(p)]^{-1}$$ where $[\cdot]^{-1}$ denotes matrix inverse and $J_G(q)$ is the Jacobian matrix of the function $G$ at the point $q$.
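Applying that last formula with $F = \psi$ and $p = I$: since $d\psi(I) = 2\,\mathrm{id}$, its inverse gives $d\phi(I)(T) = \tfrac{1}{2}T$. A numerical sanity check, computing the matrix square root near $I$ via the Denman–Beavers iteration (my choice of algorithm, not part of the answer) and differencing:

```python
import numpy as np

def sqrtm_db(A, iters=20):
    # Denman-Beavers iteration for the principal matrix square root;
    # converges for A with no eigenvalues on the nonpositive real axis,
    # in particular for A near the identity.
    Y, Z = A.copy(), np.eye(A.shape[0])
    for _ in range(iters):
        Y, Z = (Y + np.linalg.inv(Z)) / 2, (Z + np.linalg.inv(Y)) / 2
    return Y

rng = np.random.default_rng(1)
n = 4
T = rng.standard_normal((n, n))
I = np.eye(n)

# Finite-difference derivative of phi(A) = sqrt(A) at A = I in direction T.
h = 1e-6
fd = (sqrtm_db(I + h * T) - I) / h
print(np.allclose(fd, T / 2, atol=1e-4))  # matches d phi(I)(T) = T/2
```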