What is the meaning of the sign function generalized to matrices, as in the CAS Mathematica, for instance?


In Mathematica it is possible to apply arbitrary functions to square matrices (not element-wise, but using matrix rules).

This seemingly works with the Sign function: MatrixFunction[Sign, A]. For instance, this method gives

$\text{sign}\left( \begin{array}{cc} 1 & -8 \\ 1 & 7 \\ \end{array} \right)=1$

(i.e., it produces the identity matrix), but

$\text{sign}\left( \begin{array}{cc} 1 & -8 \\ -1 & 7 \\ \end{array} \right)=\left( \begin{array}{cc} -\frac{3}{\sqrt{17}} & -\frac{8}{\sqrt{17}} \\ -\frac{1}{\sqrt{17}} & \frac{3}{\sqrt{17}} \\ \end{array} \right)$
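Both results can be reproduced outside Mathematica: SciPy exposes a matrix sign function as scipy.linalg.signm. A minimal sketch checking the two examples above:

```python
import numpy as np
from scipy.linalg import signm

# First example: eigenvalues 3 and 5 (both positive), so the matrix
# sign function should return the identity matrix.
A1 = np.array([[1.0, -8.0], [1.0, 7.0]])
print(signm(A1))

# Second example: eigenvalues 4 + sqrt(17) > 0 and 4 - sqrt(17) < 0,
# so the sign is a nontrivial involution (its square is the identity).
A2 = np.array([[1.0, -8.0], [-1.0, 7.0]])
S2 = signm(A2)
print(S2)                          # matches [[-3,-8],[-1,3]] / sqrt(17)
print(np.allclose(S2 @ S2, np.eye(2)))  # True: sign(A)^2 = I
```

Note that sign(A)² = I whenever no eigenvalue of A is zero or purely imaginary, since each eigenvalue of sign(A) is ±1.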

The function can even be applied to some zero divisors.

What are the meaning and properties of the sign function applied to matrices this way?

Can the matrices whose sign equals $1$ (the identity) be considered "positive" in some sense, even if they have negative entries?

I also noticed that when applied to split-complex numbers in matrix form it takes 9 possible values: $0,1,-1, j, -j, 1/2+j/2,1/2-j/2, -1/2+j/2, -1/2-j/2$. This is in contrast to complex numbers, where the set of values is infinite.
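The count of nine values can be checked directly. Representing the split-complex number $z = a + bj$ (with $j^2 = 1$) as the matrix $\left(\begin{smallmatrix} a & b \\ b & a \end{smallmatrix}\right)$, the eigenvalues are $a+b$ and $a-b$, and the functional calculus applies $\operatorname{sign}$ to each eigenvalue, giving $\operatorname{sign}(z) = \frac{s_1+s_2}{2} + \frac{s_1-s_2}{2}\,j$ with $s_1 = \operatorname{sign}(a+b)$, $s_2 = \operatorname{sign}(a-b)$. A sketch enumerating all sign patterns (the helper name is mine):

```python
import numpy as np

def split_complex_sign(a, b):
    # For z = a + b*j, the matrix [[a, b], [b, a]] has eigenvalues
    # a+b and a-b; applying sign() to each eigenvalue gives
    #   sign(z) = (s1+s2)/2 + ((s1-s2)/2) * j
    s1, s2 = np.sign(a + b), np.sign(a - b)
    return ((s1 + s2) / 2, (s1 - s2) / 2)  # (real part, j part)

# a, b in {-1, 0, 1} already realizes every sign pattern of (a+b, a-b):
values = {split_complex_sign(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)}
print(len(values))  # 9 distinct values, matching the observation above
```

Each of $s_1, s_2$ independently takes a value in $\{-1, 0, 1\}$, so $3 \times 3 = 9$ combinations, which is exactly the finite set listed above.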

When applied to dual numbers, it gives 5 different values.

The usual rule $\text{sign}(AB)=\text{sign }A\cdot \text{sign } B$ still holds in these commutative algebras, though.

Can algebras be classified by how many values the sign function can take on them, into infinitely-signed and finitely-signed?

1 Answer

I'm not sure what Mathematica is doing here, but if it were Maple it would be applying the holomorphic functional calculus. This defines $f(A)$ whenever $f$ is analytic in a neighbourhood of the eigenvalues of the matrix $A$. If $D$ is a diagonal matrix, $f(D)$ is diagonal, with diagonal entries the values of $f$ at the diagonal entries of $D$, and $f(S D S^{-1}) = S f(D) S^{-1}$. In particular, if this is what is being used, $f(A)$ will be the identity matrix whenever all eigenvalues of $A$ are positive reals.
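For the diagonalizable case this recipe is a few lines of code. A minimal sketch (the helper name matrix_function is mine, and it assumes A is diagonalizable with f defined at every eigenvalue):

```python
import numpy as np

def matrix_function(f, A):
    """Apply a scalar function f to a diagonalizable matrix A via
    the eigendecomposition A = S D S^{-1}, so f(A) = S f(D) S^{-1}."""
    w, S = np.linalg.eig(A)
    return (S @ np.diag(f(w)) @ np.linalg.inv(S)).real

# Reproduces the two examples from the question:
A1 = np.array([[1.0, -8.0], [1.0, 7.0]])    # eigenvalues 3, 5 -> identity
A2 = np.array([[1.0, -8.0], [-1.0, 7.0]])   # eigenvalues 4 +/- sqrt(17)
print(matrix_function(np.sign, A1))
print(matrix_function(np.sign, A2))
```

Taking the real part discards tiny imaginary round-off; for matrices with genuinely complex eigenvalues one would keep the complex result, and for non-diagonalizable matrices the functional calculus needs the Jordan form instead.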