In what sense are similar matrices "the same," and how can this be generalized?


I sort of intuitively see why we care about similar matrices, i.e., when $A=S^{-1}BS$ for some invertible matrix $S$. But I want to make this intuition more precise and abstract.

Matrices: First of all, as mentioned here,

Because matrices are similar if and only if they represent the same linear operator with respect to (possibly) different bases, similar matrices share all properties of their shared underlying operator.

This is followed by a long list of shared properties. However, I feel this doesn't give the full story. For example, we also care about when two different operators are similar, in which case (conversely) they can be represented by the same matrix with respect to appropriate bases. In that case, the long list of properties is still shared by both operators. How can one precisely say what type of properties are shared by similar operators, and what's the most abstract way to understand this?
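As a quick sanity check of the shared-property claim (a sketch using SymPy; the matrices $B$ and $S$ below are arbitrary illustrative choices, not from the linked list):

```python
from sympy import Matrix

# An arbitrary operator B and an invertible change-of-basis matrix S.
B = Matrix([[1, 2], [3, 4]])
S = Matrix([[1, 1], [0, 1]])
assert S.det() != 0

# A represents the same underlying operator as B in a different basis.
A = S.inv() * B * S

# Basis-independent quantities agree, as the quoted passage predicts.
assert A.trace() == B.trace()
assert A.det() == B.det()
assert A.charpoly().as_expr() == B.charpoly().as_expr()
assert A.rank() == B.rank()
```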

Generalizations: If $A, B, S$ are elements of a group $G$, then $A = S^{-1}BS$ says that $A$ is the conjugate of $B$ by $S$. For any $S \in G$, conjugation by $S$ is an automorphism of $G$, and hence preserves all group-theoretic properties of any element. Matrix similarity extends this idea: we conjugate elements of the algebra $L(V)$ (operators) by elements of its group of units $GL(V)$ (invertible operators). Conjugation is then an algebra automorphism, so we should expect the properties of some $T \in L(V)$ as an element of the algebra $L(V)$ to be preserved by conjugation; but some of the properties in the list linked above (e.g., the determinant) are specific to $L(V)$ and need not make sense in an arbitrary algebra.
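Indeed, for fixed $S$, conjugation respects the algebra operations:

$$(S^{-1}AS)(S^{-1}BS) = S^{-1}A(SS^{-1})BS = S^{-1}(AB)S, \qquad S^{-1}AS + S^{-1}BS = S^{-1}(A+B)S,$$

and it sends $I$ to $I$ and scalars to scalars. Since conjugation by $S^{-1}$ inverts it, it is in fact an algebra automorphism.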

The most general framework I can think of is as follows: We have some group $G$ which acts on some structure $X$ from the left and right. Thus conjugation makes sense. Can we say anything about what properties of an arbitrary $x \in X$ are preserved under conjugation, in a way that includes matrix similarity as a special case?

Best Answer

Here's one way of approaching matrix similarity from an abstract point of view. Instead of an $n\times n$ matrix $M$, consider an arbitrary endomorphism $\phi$ of an $n$-dimensional vector space $V$. An endomorphism $\phi$ of $V$ is the same thing as a $k[x]$-module structure on $V$, where $x\cdot v:=\phi(v)$. From this perspective, two endomorphisms are similar precisely when the associated modules are isomorphic. So the properties shared by similar matrices are exactly the isomorphism-invariant properties of finite-dimensional $k[x]$-modules.
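Spelling this out: a $k$-linear map $f\colon (V_1,\phi_1)\to(V_2,\phi_2)$ is $k[x]$-linear exactly when it commutes with the action of $x$:

$$f(x\cdot v) = x\cdot f(v) \iff f\circ\phi_1 = \phi_2\circ f.$$

When $f$ is an isomorphism this reads $\phi_2 = f\circ\phi_1\circ f^{-1}$, which in matrix form (for the matrix $F$ of $f$) is exactly $B = FAF^{-1}$; so module isomorphism recovers the usual definition of similarity.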

For instance, the structure theory of modules over a PID gives normal form theorems, generalised eigenvalues correspond to the isomorphism classes of the simple modules appearing in a composition series, etc.
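Concretely, the structure theorem over the PID $k[x]$ decomposes the module as

$$V \cong \bigoplus_{i} k[x]/\bigl(p_i(x)^{e_i}\bigr)$$

for monic irreducibles $p_i$ (the elementary divisors); choosing natural bases in each summand gives the rational canonical form, and when $k$ is algebraically closed, so that $p_i(x) = x - \lambda_i$, it gives the Jordan form.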

The category of finite-dimensional $k[x]$-modules also has duals and (symmetric) tensor products, so we can understand transposes and exterior powers without choosing bases, which yields the characteristic polynomial categorically.

As a concrete example, from this perspective we can easily prove that a matrix and its transpose are similar, by interpreting all the pieces of this statement module-theoretically.
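To see this concretely (a small SymPy sketch; the matrix $A$ and the conjugating matrix $S$ below are illustrative choices, with $S$ found by solving the linear condition $SA = A^{T}S$ by hand and picking an invertible solution):

```python
from sympy import Matrix

A = Matrix([[1, 2], [3, 4]])

# S solves S*A = A.T*S; among the solutions of this linear system
# we pick one with nonzero determinant.
S = Matrix([[3, 3], [3, 5]])
assert S.det() != 0

# S*A = A.T*S with S invertible means A.T = S*A*S**-1,
# i.e. A and its transpose are similar.
assert S * A == A.T * S
assert S * A * S.inv() == A.T
```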

Understanding when a piece of linear algebra is really describing properties of finite-dimensional $k[x]$-modules is very helpful, and helps delineate matrix-specific properties from abstract properties of linear endomorphisms (which may generalise to other settings).