I am working with matrices from the Lie algebra of SL(3, R) and trying to recover a matrix after computing its matrix exponential. With the Eigen C++ library, the round trip log(exp(A)) deviates noticeably from the original matrix A; with SciPy in Python, the same round trip appears to recover A correctly. Both libraries are applied to the same matrix, and in both cases the result preserves the zero-trace condition.
Here are the details:
The original matrix A is from the Lie algebra of SL(3, R), hence it is a real 3x3 matrix with trace zero. With Eigen, I compute the exponential exp(A) followed by the logarithm log(exp(A)); the resulting matrix does not match A. With SciPy, using scipy.linalg.expm and scipy.linalg.logm, the log(exp(A)) computation recovers A accurately. The mismatch occurs even though the trace is zero and agrees between the original matrix and the one recovered with Eigen. My questions are:
1. What could explain the discrepancy between the Eigen and SciPy results, given that both operate on the same matrix?
2. Are there known differences in how Eigen and SciPy compute the matrix logarithm that could cause such a discrepancy?
3. How can I ensure the correct branch selection in Eigen, so that its result matches SciPy's?

Any insight into the multivalued nature of the matrix logarithm and how to handle it in Eigen (or in numerical libraries generally) would be greatly appreciated. If SciPy uses specific settings or strategies to maintain accuracy that Eigen lacks, that information would also be very helpful.
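For concreteness, here is the scalar analogue of the branch issue I am asking about, using only the standard library (a sketch; the matrix logarithm applies this eigenvalue-wise, so I suspect eigenvalues with large imaginary parts are relevant):

```python
import cmath

# The principal logarithm has imaginary part in (-pi, pi], so the scalar
# round trip log(exp(z)) recovers z only when |Im(z)| <= pi.
z_small = 1.0 + 2.0j   # |Im| < pi: round trip recovers z
z_large = 1.0 + 8.0j   # |Im| > pi: round trip lands on another branch

print(cmath.log(cmath.exp(z_small)))  # (1+2j)
print(cmath.log(cmath.exp(z_large)))  # 1 + (8 - 2*pi)j, not 1 + 8j
```

If something analogous happens at the matrix level, I would like to understand how each library's branch choice comes into play.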
Thank you for your assistance!
Here is the matrix A:
\begin{bmatrix}
-4.979256689548492432 & 10.07702589035034180 & 1.415015459060668945 \\
-7.213447690010070801 & 2.406612932682037354 & -6.602815389633178711 \\
-6.363325119018554688 & -18.10230255126953125 & 2.572643756866455078
\end{bmatrix}
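For reference, this is a minimal sketch of the SciPy round trip I described (assuming NumPy and SciPy are available); it prints the traces and the elementwise deviation between A and logm(expm(A)):

```python
import numpy as np
from scipy.linalg import expm, logm

# The traceless 3x3 matrix A from above (Lie algebra of SL(3, R)).
A = np.array([
    [-4.979256689548492432,  10.07702589035034180,   1.415015459060668945],
    [-7.213447690010070801,   2.406612932682037354, -6.602815389633178711],
    [-6.363325119018554688, -18.10230255126953125,   2.572643756866455078],
])

A_rt = logm(expm(A))  # round trip: log(exp(A))

print("trace of A:              ", np.trace(A))
print("trace of logm(expm(A)):  ", np.trace(A_rt).real)
print("max |A - logm(expm(A))|: ", np.max(np.abs(A - A_rt)))
```

The zero-trace condition is what I checked in both libraries; the last line is the deviation I compare against Eigen's result.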