Matrix Square Root Singularity


I have a smooth function $f:\mathbb{R}^2\rightarrow \mathbb{R}$ with gradient $\nabla f = [f_x, f_y]$ and a matrix $$ M = (1 + f_x^2 + f_y^2)^{-1} \begin{bmatrix} 1 + f_y^2 & -f_xf_y & 0 \\ -f_xf_y & 1 + f_x^2 & 0 \\ 0 & 0 & \xi^{-1}(1 + f_x^2 + f_y^2) \end{bmatrix} $$ where $\xi >0$. Note that $M$ is symmetric positive definite.

I want the matrix square root of $M$. Working it out, I found: $$ \sqrt{M} = (f_x^2 + f_y^2)^{-1} \begin{bmatrix} f_y^2 +f_x^2Q & f_xf_y(Q-1) & 0\\ f_xf_y(Q-1) & f_x^2 + f_y^2Q &0 \\ 0 & 0 & (f_x^2 + f_y^2)\xi^{-1/2} \end{bmatrix} $$ where $ Q=\sqrt{(1 + f_y^2 + f_x^2)^{-1}} $.
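As a sanity check, the closed form can be verified numerically at a generic point (the values $f_x = 0.3$, $f_y = -0.7$, $\xi = 2$ below are arbitrary, chosen only so that $f_x^2 + f_y^2 > 0$):

```python
import numpy as np

# Arbitrary test values (any fx, fy with fx**2 + fy**2 > 0 and xi > 0 work)
fx, fy, xi = 0.3, -0.7, 2.0
g = fx**2 + fy**2            # |grad f|^2
Q = (1 + g) ** -0.5          # Q = sqrt((1 + fx^2 + fy^2)^{-1})

# M as defined above
M = (1 / (1 + g)) * np.array([
    [1 + fy**2, -fx * fy,  0.0],
    [-fx * fy,  1 + fx**2, 0.0],
    [0.0,       0.0,       (1 + g) / xi],
])

# Candidate square root from the closed form
S = (1 / g) * np.array([
    [fy**2 + fx**2 * Q, fx * fy * (Q - 1), 0.0],
    [fx * fy * (Q - 1), fx**2 + fy**2 * Q, 0.0],
    [0.0,               0.0,               g * xi**-0.5],
])

print(np.allclose(S @ S, M))  # True: S is indeed a square root of M
```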

This does indeed satisfy $\sqrt{M}\sqrt{M}=M$ but has a serious problem: $\sqrt{M}$ is indeterminate ($0/0$) when $f_x = f_y = 0$. I expected $\sqrt{M}=I$ in that case (taking $\xi=1$).

How can I fix $\sqrt{M}$ to not blow up when $\nabla f$ vanishes?