I was trying to remember how to show that any invertible matrix has a (possibly complex) logarithm. What I came up with seemed kind of cool, so I thought I'd post my answer here.
2026-04-07 09:23:41
Show that any invertible matrix has a logarithm.
474 views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Since any invertible matrix is similar to its Jordan form, and all of its eigenvalues are nonzero, it suffices to show that each Jordan block with eigenvalue $\lambda \neq 0$ has a logarithm.
First note that the exponential of a Jordan block is $$ \left[\begin{matrix} \lambda & 1 & 0 & \dotsb & 0 \\ & \ddots & \ddots & \ddots & \vdots \\ & & \ddots & \ddots & 0 \\ & & & \ddots & 1 \\ & & & & \lambda\\ \end{matrix}\right] \overset{\exp}{\mapsto} \left[\begin{matrix} e^{\lambda} & e^{\lambda} & \frac{e^{\lambda}}{2!} & \dotsb & \frac{e^{\lambda}}{(k-1)!} \\ & e^{\lambda} & \ddots & \ddots & \vdots \\ & & \ddots & e^{\lambda} & \frac{e^{\lambda}}{2!} \\ & & & e^{\lambda} & e^{\lambda} \\[7pt] & & & & e^{\lambda} \\ \end{matrix}\right], $$ since $\exp(\lambda I + N) = e^{\lambda}\exp(N) = e^{\lambda}\left(I + N + \frac{N^2}{2!} + \dotsb + \frac{N^{k-1}}{(k-1)!}\right)$, where $N$ is the elementary nilpotent matrix of size $k$.

So, reversing the process, given a $k \times k$ Jordan block $$J=\left[\begin{matrix} a & 1 & \dotsb & 0 \\ & a & \ddots & \vdots \\ & & \ddots & 1 \\ & & & a \end{matrix}\right] $$ with $a\neq 0$, we want to show that $J$ is similar to $$ \left[\begin{matrix} a & a & \frac{a}{2!} & \dotsb & \frac{a}{(k-1)!} \\ & a & \ddots & \ddots & \vdots \\ & & \ddots & a & \frac{a}{2!} \\ & & & a & a \\[7pt] & & & & a \\ \end{matrix}\right], $$ since we know how to find a logarithm of the above matrix — it is the exponential of the Jordan block with eigenvalue $\mu$, where $\mu$ is any complex logarithm of $a$ (which exists because $a \neq 0$) — and since $\log(UMU^{-1})=U\log(M)U^{-1}$, similarity is enough.

Now, since the scalar matrix $aI$ has the same form with respect to any basis (conjugation fixes it), we can neglect this part and just ask whether we can conjugate $$\left[\begin{matrix} 0 & 1 & \dotsb & 0 \\ & 0 & \ddots & \vdots \\ & & \ddots & 1 \\ & & & 0 \end{matrix}\right] \tag{$M_1$} $$ into $$ \left[\begin{matrix} 0 & a & \frac{a}{2!} & \dotsb & \frac{a}{(k-1)!} \\ & 0 & \ddots & \ddots & \vdots \\ & & \ddots & a & \frac{a}{2!} \\ & & & 0 & a \\[7pt] & & & & 0 \\ \end{matrix}\right]. \tag{$M_2$} $$

First of all, we can see algebraically that these two matrices should be similar, since $M_2= aN + \dfrac{a}{2!}N^2 + \dots + \dfrac{a}{(k-1)!}N^{k-1}$ (here $N=M_1$, incidentally). Taking powers of $M_2$ shows that $M_2^{k-1}\neq 0$ and $M_2^k=0$: the $(1,k)$ entry of $M_2^{k-1}$ is $a^{k-1}\neq 0$, while $M_2^k = 0$ because $M_2$ is strictly upper triangular. Therefore $M_1$ and $M_2$ have the same minimal polynomial, $x^k$.
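As a sanity check, the finite series above can be verified numerically. Here is a minimal NumPy sketch (my own illustration, not part of the original argument; the function names are made up): it builds a Jordan block $J = aI + N$, computes a logarithm via the terminating series $\log(a)I + \sum_{m\geq 1} (-1)^{m+1}(N/a)^m/m$, and exponentiates back using the terminating exponential series for the nilpotent part.

```python
import numpy as np

def jordan_block(a, k):
    """k x k Jordan block: a on the diagonal, 1 on the superdiagonal."""
    return a * np.eye(k) + np.diag(np.ones(k - 1), 1)

def jordan_log(a, k):
    """A logarithm of the Jordan block J = aI + N, for a != 0.

    log(J) = log(a) I + log(I + N/a); the log series terminates
    because N is nilpotent (N^k = 0).  For negative or complex a,
    pass a as a complex number so np.log uses a complex branch.
    """
    N = np.diag(np.ones(k - 1), 1)
    L = np.log(a) * np.eye(k, dtype=complex)
    for m in range(1, k):
        L = L + (-1) ** (m + 1) * np.linalg.matrix_power(N / a, m) / m
    return L

def exp_scalar_plus_nilpotent(L):
    """exp(L) for L = cI + K with K strictly upper triangular:
    exp(L) = e^c (I + K + K^2/2! + ...), a finite sum of k terms."""
    k = L.shape[0]
    c = L[0, 0]                       # scalar (diagonal) part
    K = L - c * np.eye(k)             # nilpotent part
    E = np.zeros((k, k), dtype=complex)
    term = np.eye(k)
    for m in range(k):
        E = E + term
        term = term @ K / (m + 1)
    return np.exp(c) * E

a, k = 2.0, 4
J = jordan_block(a, k)
L = jordan_log(a, k)
assert np.allclose(exp_scalar_plus_nilpotent(L), J)   # exp(log J) == J
```

Because both series are applied to nilpotent matrices, they are finite sums, so this is an exact construction up to floating-point rounding.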
I claim they also have the same characteristic polynomial, $x^k$: both are $k \times k$ strictly upper triangular, hence nilpotent. Since the last invariant factor is the minimal polynomial and the invariant factors multiply to the characteristic polynomial, the only possibility for the invariant factors of $M_1$ and $M_2$ is $1, 1, \dotsc, 1, x^k$. Therefore they are similar.
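One can also exhibit a concrete conjugation instead of appealing to invariant factors: since $M_2^{k-1}\neq 0$, any vector $v$ with $M_2^{k-1}v \neq 0$ gives a basis $M_2^{k-1}v, \dotsc, M_2 v, v$ in which $M_2$ acts as the elementary nilpotent $M_1$. A small NumPy sketch (my own illustration, not from the original post):

```python
import numpy as np
from math import factorial

k, a = 5, 3.0
N = np.diag(np.ones(k - 1), 1)          # elementary nilpotent; N = M1
M1 = N
# M2 = aN + (a/2!) N^2 + ... + (a/(k-1)!) N^(k-1)
M2 = sum(a / factorial(m) * np.linalg.matrix_power(N, m) for m in range(1, k))

# Same minimal polynomial x^k:
assert np.linalg.matrix_power(M2, k - 1).any()        # M2^(k-1) != 0
assert not np.linalg.matrix_power(M2, k).any()        # M2^k == 0

# Krylov basis: columns M2^(k-1) v, ..., M2 v, v, with v = e_k
v = np.zeros(k)
v[-1] = 1.0
P = np.column_stack([np.linalg.matrix_power(M2, k - 1 - j) @ v
                     for j in range(k)])
assert abs(np.linalg.det(P)) > 1e-9                   # P is invertible
assert np.allclose(M2 @ P, P @ M1)                    # M2 = P M1 P^(-1)
```

The check $M_2 P = P M_1$ is exactly the statement that $M_2$, written in the Krylov basis, is the elementary nilpotent block.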
That said, I thought I would tackle the more general problem: Suppose we want to conjugate $$\left[\begin{matrix} 0 & 1 & \dotsb & 0 \\ & 0 & \ddots & \vdots \\ & & \ddots & 1 \\ & & & 0 \end{matrix}\right] \tag{$M_3$}$$ into $$\left[\begin{matrix} 0 & a_{12} & a_{13} & \dotsb & a_{1k} \\ & 0 & a_{23} & \dotsb & a_{2k} \\ & & \ddots & \ddots & \vdots \\ & & & \ddots & a_{k-1,k}\\ & & & & 0 \end{matrix}\right]. \tag{$M_4$}$$
Given that $\mathcal{B}=\{v_1, \dotsc, v_k\}$ is an ordered basis such that $M_3 = [T]_{\mathcal{B}\mathcal{B}}$, conjugating $M_3$ into $M_4$ is equivalent to finding an ordered basis $\mathcal{C} = \{w_1, \dotsc, w_k\}$ such that $[T]_{\mathcal{C}\mathcal{C}}=M_4$.
Let $\mathcal{B} = \{v_1, \dotsc, v_k\}$ be the basis above, so that $Tv_1 = 0$ and $Tv_j = v_{j-1}$ for $j \geq 2$. Let $S$ be the "shift up" map defined on $v_1, \dotsc, v_{k-1}$ by $S(v_m) = v_{m+1}$, so that $T(S(v_m)) = v_m$. Define $w_1, \dotsc, w_k$ as follows: let $w_1 = v_1$, and for $i \geq 2$ let $$w_i = \sum_{1\leq j\leq i-1} a_{j,i}\, S(w_j).$$ By induction each $w_j$ lies in the span of $v_1, \dotsc, v_j$, so $S(w_j)$ makes sense, and applying $T$ gives $$Tw_i = \sum_{1\leq j\leq i-1} a_{j,i}\, w_j,$$ which is exactly the $i$-th column of $M_4$.
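A construction along these lines can be checked numerically. The sketch below (my own illustration, with randomly chosen entries $a_{ij}$) builds $w_i = \sum_{j<i} a_{j,i}\,S(w_j)$ in coordinates, where $S$ shifts $v_m \mapsto v_{m+1}$, assembles the change-of-basis matrix $P$ whose columns are the $w_i$, and verifies $M_3 P = P M_4$, i.e. $[T]_{\mathcal{C}\mathcal{C}} = M_4$.

```python
import numpy as np

k = 4
rng = np.random.default_rng(0)

# Strictly upper triangular M4; entries in [1, 2], so the
# superdiagonal is nonzero
M4 = np.triu(rng.uniform(1.0, 2.0, (k, k)), 1)

M3 = np.diag(np.ones(k - 1), 1)          # elementary nilpotent

# Shift-up map S: S v_m = v_{m+1} (in coordinates, shift entries down)
S = np.diag(np.ones(k - 1), -1)

# w_1 = v_1; w_i = sum_{j<i} a_{ji} S(w_j)
W = np.zeros((k, k))
W[0, 0] = 1.0
for i in range(1, k):
    W[:, i] = sum(M4[j, i] * (S @ W[:, j]) for j in range(i))

P = W                                     # columns = coordinates of w_i
assert abs(np.linalg.det(P)) > 1e-12      # the w_i form a basis
assert np.allclose(M3 @ P, P @ M4)        # [T]_CC = M4
```

Here $M_3 P = P M_4$ encodes $Tw_i = \sum_j a_{j,i} w_j$ column by column.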
We need to make sure the $w_i$ are linearly independent. I claim they are iff $M_3$ is similar to $M_4$, iff $M_4^{k-1}\neq 0$. One condition which guarantees this is that all entries on the superdiagonal of $M_4$ are nonzero. In fact this condition is also necessary: since every factor of $M_4$ moves indices up by at least one, the only entry of $M_4^{k-1}$ that can be nonzero is the $(1,k)$ entry, and it equals the product $a_{12}a_{23}\dotsm a_{k-1,k}$ of the superdiagonal entries. So $M_4^{k-1}\neq 0$ exactly when every superdiagonal entry is nonzero, and in that case each $w_i$ has a nonzero $v_i$-coefficient (namely $a_{12}a_{23}\dotsm a_{i-1,i}$), so the $w_i$ are triangular with respect to the $v_j$ and hence linearly independent.