Relationship between invertible matrices and polynomials


Assume we have a matrix $B$ with dimensions $n \times n$ over the field $K$.

Prove if matrix $B$ is invertible, then $B^{-1} = f(B)$ for some polynomial $f \in K[X]$.

I understand that I have to express $f(B)$ as some kind of matrix product.

Something like $f(B) = C + D \cdot F$, where $C, D, F$ are matrices, but I don't know what these matrices are.

Best answer

The space of $n \times n$ matrices over $K$ is finite dimensional, so there must be a smallest $k$ such that $I_n, B, B^2, \ldots, B^k$ are linearly dependent ($k \le n^2$ is a very crude bound, since the space has dimension $n^2$; the Cayley–Hamilton theorem improves this to $k \le n$).

The existence of such a nontrivial linear dependence relation gives you a polynomial relation $$g(B) = a_0I_n + \sum_{j=1}^{k} a_jB^j = 0$$ with $a_j \in K$.
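As a sanity check, such a dependence relation can be found numerically. A minimal sketch with NumPy, using a hypothetical $2 \times 2$ example matrix: flatten the powers of $B$ into columns and stop at the first $k$ where they become dependent, then read the coefficients $a_0, \ldots, a_k$ off a null vector.

```python
import numpy as np

# Hypothetical example matrix; any square B over the reals works here.
B = np.array([[2.0, 1.0],
              [1.0, 3.0]])
n = B.shape[0]

# Flatten I, B, B^2, ... into columns of M and stop at the first k for
# which the powers become linearly dependent.
powers = [np.eye(n)]
while True:
    M = np.column_stack([P.ravel() for P in powers])
    if np.linalg.matrix_rank(M) < len(powers):
        break
    powers.append(powers[-1] @ B)

# A null vector of M holds coefficients a_0, ..., a_k with g(B) = 0;
# the last right-singular vector spans the (numerical) null space.
_, _, Vt = np.linalg.svd(M)
a = Vt[-1]
g_of_B = sum(a_j * P for a_j, P in zip(a, powers))
print(np.allclose(g_of_B, 0))   # True
```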

Now can you use the fact that $B$ is invertible to play with the polynomial $g$ a little bit and get a relation $f(B) = B^{-1}$ as desired?

(Note it's especially simple if $a_0$ is nonzero, but not a problem if it's not.)
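(Spelling out the easy case: if $a_0 \neq 0$, multiplying the relation $g(B) = 0$ by $B^{-1}$ gives
$$0 = B^{-1}g(B) = a_0 B^{-1} + \sum_{j=1}^{k} a_j B^{j-1} \quad\Longrightarrow\quad B^{-1} = -\frac{1}{a_0}\sum_{j=1}^{k} a_j B^{j-1},$$
so $f(X) = -\frac{1}{a_0}\sum_{j=1}^{k} a_j X^{j-1}$ does the job.)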

Hint, if you want one

There must be more than one nonzero coefficient in the relation. If $a_iB^i$ were the only nonzero term, multiplying both sides by $B^{-i}$ would immediately yield $a_i I_n = 0$, hence $a_i = 0$, contradicting the nontriviality of the relation.
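To see the whole statement in action numerically, here is a sketch (assuming NumPy, and using the characteristic polynomial via Cayley–Hamilton rather than the minimal polynomial, which is a valid choice of $g$ since its constant term $(-1)^n \det B$ is nonzero when $B$ is invertible):

```python
import numpy as np

# Hypothetical invertible example matrix; any invertible B works.
B = np.array([[2.0, 1.0],
              [1.0, 3.0]])
n = B.shape[0]

# Coefficients of the characteristic polynomial det(xI - B),
# highest degree first: [1, c_{n-1}, ..., c_1, c_0].
c = np.poly(B)

# Cayley-Hamilton: B^n + c_{n-1} B^{n-1} + ... + c_1 B + c_0 I = 0.
# Here c_0 = (-1)^n det(B) != 0 because B is invertible, so
# B^{-1} = -(1/c_0) (B^{n-1} + c_{n-1} B^{n-2} + ... + c_1 I).
# Evaluate that polynomial in B by Horner's rule.
acc = np.eye(n)                 # leading coefficient 1
for coeff in c[1:n]:            # c_{n-1}, ..., c_1
    acc = acc @ B + coeff * np.eye(n)
B_inv = -acc / c[n]

print(np.allclose(B_inv @ B, np.eye(n)))   # True
```

So $B^{-1}$ really is a polynomial in $B$, evaluated here with explicitly computed coefficients.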