Non-integral power of a singular matrix


I know that if $A$ is a nonsingular matrix, i.e. $\det A \ne 0$, then $A^p=\exp\left(p\ln A\right)$ holds for any real exponent $p$. But what if $A$ is singular? Then $A$ has a zero eigenvalue, so the matrix logarithm does not exist.
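For the nonsingular case, the formula in the question can be checked numerically; here is a minimal sketch using SciPy's `logm` and `expm` (the matrix and exponent are arbitrary illustrative choices):

```python
import numpy as np
from scipy.linalg import expm, logm

# A nonsingular matrix: eigenvalues 2 and 3, so det A != 0
# and the principal matrix logarithm exists.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
p = 0.5

Ap = expm(p * logm(A))  # A^p = exp(p ln A)

# Sanity check: (A^{1/2})^2 should recover A.
print(np.allclose(Ap @ Ap, A))
```

The sanity check squares the computed $A^{1/2}$ and compares it with $A$ up to floating-point tolerance.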

Is there any extension in this case?


BEST ANSWER

You cannot do it in general, even for positive powers. For example, suppose $N$ is a nilpotent matrix of index $n > 1$ on $\mathbb{R}^{n}$, i.e., $N^{n}=0$ but $N^{n-1}\ne 0$. If $M=\sqrt{N}$ existed, then you would have $M^{2n}=0$ but $M^{2n-2}\ne 0$. That is impossible for an $n\times n$ matrix: the minimal polynomial of $M$ divides $\lambda^{2n}$ and has degree at most $n$, so $M^{n}=0$; since $2n-2\ge n$ for $n>1$, this forces $M^{2n-2}=0$, a contradiction.

Because there has been no activity on this question, I thought I'd give you the simplest example. $$ A = \left[\begin{array}{cc} 0 & 1 \\ 0 & 0\end{array}\right] $$ This matrix satisfies $A^{2}=0$ but $A\ne 0$, so the minimal polynomial of $A$ is $m(\lambda)=\lambda^{2}$. If there were a $2\times 2$ matrix $B$ such that $B^{2}=A$, then $B^{4}=0$ but $B^{2}\ne 0$. That is impossible: the minimal polynomial $q$ of $B$ must divide $\lambda^{4}$ and cannot have degree greater than $2$, so $B^{2}=0$, contradicting $B^{2}=A\ne 0$.

You can also try directly and see that there is no $B$ such that $B^{2}=A$; you know it already from these general considerations, but try it: $$ B^{2} = \left[\begin{array}{cc} a & b\\ c & d \end{array}\right] \left[\begin{array}{cc} a & b\\ c & d \end{array}\right] = \left[\begin{array}{cc}a^{2}+bc & ab+bd \\ ca+dc & cb+d^{2}\end{array}\right] = \left[\begin{array}{cc} 0 & 1 \\ 0 & 0 \end{array}\right]. $$ Because $a^{2}+bc=0$ and $cb+d^{2}=0$, we get $a^{2}-d^{2}=0$, so $a=\pm d$. But $a\ne -d$, because that would contradict $ab+bd=(a+d)b=1$. So $a=d$ must hold. The lower-left corner is $c(a+d)=0$, and $a+d\ne 0$ because $(a+d)b=1$, so $c=0$. That forces $0=a^{2}+bc=a^{2}$, which gives $a=0$ and thus $d=0$, whence $ab+bd=0$, yielding the desired contradiction. So there is no $B$ with $B^{2}=A$ (you can see it abstractly, and you can see it concretely).
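The nonexistence argued above can also be confirmed symbolically; here is a quick sketch using SymPy's `solve` on the four entry equations (assuming SymPy is available):

```python
from sympy import symbols, solve

a, b, c, d = symbols('a b c d')

# Entries of B^2 - A, for B = [[a, b], [c, d]] and A = [[0, 1], [0, 0]].
equations = [
    a**2 + b*c,        # top-left of B^2 must equal 0
    a*b + b*d - 1,     # top-right must equal 1
    c*a + d*c,         # bottom-left must equal 0
    c*b + d**2,        # bottom-right must equal 0
]

# The polynomial system has no solutions, even over the complex numbers.
solutions = solve(equations, [a, b, c, d])
print(solutions)
```

An empty solution list confirms that no matrix $B$, real or complex, squares to $A$.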

ANSWER

If $A$ is diagonalizable and $A=SDS^{-1}$, a general way to define a continuous function of it is to set $f(A)=Sf(D)S^{-1}$, where $f(D)$ is obtained by applying $f$ to the eigenvalues on the diagonal. So $A^p=SD^pS^{-1}$ (with $0^p=0$ for any $p>0$).
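Here is a sketch of this recipe for a singular but diagonalizable matrix (the matrix below is an arbitrary illustrative choice, and a small tolerance guards against round-off in the zero eigenvalue):

```python
import numpy as np

# Singular but diagonalizable: eigenvalues 0 and 2.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
p = 0.5

# A is symmetric, so eigh returns an orthonormal eigenbasis S.
w, S = np.linalg.eigh(A)

# Snap the numerically-zero eigenvalue to exactly 0, then apply
# f(x) = x^p to the diagonal (with 0^p = 0 for p > 0).
w = np.where(np.abs(w) < 1e-12, 0.0, w)
Ap = S @ np.diag(w**p) @ S.T

print(np.allclose(Ap @ Ap, A))  # (A^{1/2})^2 = A
```

The resulting $A^{1/2}$ is itself singular, as it must be, since $\det(A^{1/2})^2=\det A=0$.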

If $A$ is non-diagonalizable one can use the Jordan canonical form $J$ instead of $D$, so $f(A)=Sf(J)S^{-1}$, but then one also has to define $f(J_\lambda)$ for Jordan cells $J_\lambda$ with $\lambda$ on the diagonal. The catch is that it is no longer enough for $f$ to be continuous: if $J_\lambda$ is $n\times n$, one needs $n-1$ derivatives of $f$ at $\lambda$, see https://en.wikipedia.org/wiki/Matrix_function#Jordan_decomposition.
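To make this concrete (restating the standard Jordan-cell formula from the linked reference), $f$ applied to an $n\times n$ Jordan cell is $$ f(J_\lambda) = \begin{pmatrix} f(\lambda) & f'(\lambda) & \frac{f''(\lambda)}{2!} & \cdots & \frac{f^{(n-1)}(\lambda)}{(n-1)!} \\ 0 & f(\lambda) & f'(\lambda) & \cdots & \frac{f^{(n-2)}(\lambda)}{(n-2)!} \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & \cdots & 0 & f(\lambda) & f'(\lambda) \\ 0 & \cdots & 0 & 0 & f(\lambda) \end{pmatrix}, $$ which is why the derivatives $f'(\lambda),\dots,f^{(n-1)}(\lambda)$ must exist at $\lambda$.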

For $f(z)=z^p$ with fractional $p$, enough derivatives at $0$ exist only if $p>n-1$, so $A^p$ is defined if and only if $p>n-1$, where $n$ is the size of the largest nilpotent cell in the Jordan canonical form of $A$. In particular, $\begin{pmatrix}0&1\\0&0\end{pmatrix}^{1/2}$ is undefined, but $\begin{pmatrix}0&1\\0&0\end{pmatrix}^{3/2}=\mathbf{0}$; in fact $\begin{pmatrix}0&1\\0&0\end{pmatrix}^{p}=\mathbf{0}$ for any $p>1$, so the matrix becomes nil sooner than the integer powers suggest.

Another issue is that even for non-singular matrices the logarithm, and hence fractional powers, are not uniquely defined. If your $1\times1$ matrix is $(-1)$, then $(-1)^{1/2}$ could mean $(i)$ or $(-i)$. You can make the choice continuously for complex numbers, but only at the expense of banning certain complex numbers as arguments. Such a choice is called a "branch" in complex analysis, and the banned numbers form a "branch cut". For example, setting $z^p:=|z|^pe^{ip\theta}$ for $z=|z|e^{i\theta}$ and $\theta\in(-\pi,\pi)$ bans the negative real numbers. Actually, you can include even them, by allowing either $\pi$ or $-\pi$, but then your function will not be continuous along the negative half-axis. Once you have made your choice of a branch for $z^p$, you can extend it to matrices, but again it will be discontinuous at matrices with eigenvalues on the branch cut.
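A small sketch of such a branch choice for scalars, using the principal branch $\theta\in(-\pi,\pi]$ (the helper name here is mine, not a standard API):

```python
import cmath

def principal_power(z, p):
    """z^p on the principal branch, with theta in (-pi, pi]."""
    r, theta = cmath.polar(z)  # theta = atan2(Im z, Re z)
    return r**p * cmath.exp(1j * p * theta)

# (-1)^{1/2} on this branch is i, not -i.
print(principal_power(-1, 0.5))

# Discontinuity across the cut: points just above and just below the
# negative real axis get square roots near i and near -i, respectively.
above = principal_power(complex(-1.0, 1e-12), 0.5)
below = principal_power(complex(-1.0, -1e-12), 0.5)
print(above, below)
```

The jump between `above` and `below` is the discontinuity along the negative half-axis described above; for matrices the same jump appears at eigenvalues on the cut.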