Consider a matrix $A$ and the extended matrix $B$: \begin{equation*} B = \begin{pmatrix} A & 0 \\ 0 & 0 \end{pmatrix} \end{equation*}
How do the logarithms of the two matrices relate?
I have done some testing with `scipy.linalg.logm`:
- If you try:

  ```python
  import numpy as np
  from scipy.linalg import logm

  A = np.array([[0.572, 0.180, 0.814, 0.085, 0.065],
                [0.462, 0.462, 0.167, 0.027, 0.997],
                [0.735, 0.919, 0.353, 0.939, 0.895],
                [0.763, 0.931, 0.511, 0.450, 0.186],
                [0.144, 0.428, 0.314, 0.077, 0.081]])
  B = np.array([[0.572, 0.180, 0.814, 0.085, 0.065, 0.0],
                [0.462, 0.462, 0.167, 0.027, 0.997, 0.0],
                [0.735, 0.919, 0.353, 0.939, 0.895, 0.0],
                [0.763, 0.931, 0.511, 0.450, 0.186, 0.0],
                [0.144, 0.428, 0.314, 0.077, 0.081, 0.0],
                [0.0,   0.0,   0.0,   0.0,   0.0,   0.0]])
  ```
  you will notice that `logm(B)` differs from `logm(A)` only by an extra row and column of zeros, except for the corner entry, which always comes out as `-4.60517019e+01`.
- A similar thing happens if you take the identity matrix and set one or more diagonal entries to zero: `logm` produces `-4.60517019e+01` in the corresponding diagonal positions.
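Both observations are easy to reproduce without typing matrices by hand. Here is a minimal sketch; the random seed, the matrix sizes, and the choice of which diagonal entry to zero out are arbitrary choices of mine:

```python
import numpy as np
from scipy.linalg import logm

# Observation 1: pad a generic matrix with a zero row and column.
rng = np.random.default_rng(0)
A = rng.random((5, 5))
B = np.pad(A, ((0, 1), (0, 1)))   # append a row and a column of zeros

LA = logm(A)
LB = logm(B)                      # SciPy warns that B is exactly singular
print(LB[-1, -1])                 # corner entry, approximately -46.0517
print(np.allclose(LB[:-1, :-1], LA))

# Observation 2: identity matrix with one diagonal entry zeroed out.
M = np.eye(4)
M[2, 2] = 0.0                     # M is now singular
L = logm(M)                       # again SciPy warns about exact singularity
print(np.diag(L))                 # zeros, except approximately -46.0517 at position 2
```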
Even over the complex numbers, a matrix has a logarithm if and only if it is invertible: if $B = e^X$, then $\det B = e^{\operatorname{tr} X} \neq 0$. Since $B$ has a row and a column of zeros, it is singular and therefore has no matrix logarithm at all.
The value $-46.0517019$ should have tipped you off that SciPy is cheating: it is $\log 10^{-20}$. SciPy deliberately substitutes a tiny positive value of about $10^{-20}$ for the exactly zero eigenvalue so that the computation can proceed; the result it returns is therefore not a genuine logarithm of $B$.
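The identification of the mystery constant can be checked directly (here $10^{-20}$ is the fudge factor that, by the observations above, SciPy appears to substitute for the zero eigenvalue):

```python
import math

# natural log of the tiny value substituted for the zero eigenvalue
print(math.log(1e-20))   # approximately -46.0517018599
```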