Let $T$ be a linear operator on a finite-dimensional vector space $V$, and let $\beta$ be an ordered basis of $V$. My textbook says we can represent $T$ in matrix form as $[T]_{\beta}$. If $T$ is invertible, we can likewise write its inverse $T^{-1}$ in matrix form as $[T^{-1}]_{\beta}$.
Could we say that $[T^{-1}]_{\beta} = [T]_{\beta}^{-1}$?
For context, I am trying to solve a problem that asks us to prove that if an invertible linear operator $T$ is diagonalizable, then $T^{-1}$ is diagonalizable. If $T$ is diagonalizable, its matrix representation $[T]_\beta$ is similar to a diagonal matrix:
$$[T]_{\beta} = Q^{-1}BQ$$
where $Q$ is an invertible matrix and $B$ is a diagonal matrix. I want to take the inverse of both sides, which is where I get stuck; hence the question above about whether $[T^{-1}]_{\beta} = [T]_{\beta}^{-1}$.
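As a quick numerical sanity check of this conjecture, here is a small pure-Python sketch with an arbitrarily chosen diagonal matrix $B$ and invertible $Q$ (these specific matrices are my own made-up example, not from the problem):

```python
# Check that for A = Q^{-1} B Q (diagonalizable, invertible),
# A^{-1} equals Q^{-1} B^{-1} Q, using explicit 2x2 arithmetic.

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(X):
    """Invert a 2x2 matrix via the adjugate formula."""
    det = X[0][0] * X[1][1] - X[0][1] * X[1][0]
    return [[ X[1][1] / det, -X[0][1] / det],
            [-X[1][0] / det,  X[0][0] / det]]

B = [[2.0, 0.0], [0.0, 3.0]]   # arbitrary invertible diagonal matrix
Q = [[1.0, 1.0], [0.0, 1.0]]   # arbitrary invertible matrix

A = matmul(inv2(Q), matmul(B, Q))            # A = Q^{-1} B Q
lhs = inv2(A)                                 # A^{-1}
rhs = matmul(inv2(Q), matmul(inv2(B), Q))     # Q^{-1} B^{-1} Q

assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

The assertion passes, which is consistent with (though of course does not prove) the claim.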
Yes, your assertion is true. To see this, you can use a very general statement about linear transformations and their associated matrix representations:
Let $U, V, W$ be finite-dimensional vector spaces with bases $\beta, \gamma, \delta$, and let $F : U \to V$ and $G : V \to W$ be linear transformations with matrices $[F]_\beta^\gamma$ and $[G]_\gamma^\delta$ with respect to those bases. Then the following equation relates composition of linear maps to matrix multiplication: $$ [G \circ F]_\beta^\delta = [G]_\gamma^\delta[F]_\beta^\gamma $$
In particular, for your case take $U = V = W$ and $F = T$, $G = T^{-1}$. (Here the notation $[T]_\beta$ is shorthand for the more explicit $[T]_\beta^\beta$.) Now, using the equation above, do you see why $([T]_\beta)^{-1} = [T^{-1}]_\beta$?
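For completeness, here is one way the computation can be spelled out, writing $I_V$ for the identity operator on $V$ and $I$ for the identity matrix:

$$ [T^{-1}]_\beta \, [T]_\beta = [T^{-1} \circ T]_\beta = [I_V]_\beta = I, \qquad [T]_\beta \, [T^{-1}]_\beta = [T \circ T^{-1}]_\beta = [I_V]_\beta = I, $$

so $[T^{-1}]_\beta = ([T]_\beta)^{-1}$. Applied to your diagonalization,

$$ [T^{-1}]_\beta = ([T]_\beta)^{-1} = (Q^{-1} B Q)^{-1} = Q^{-1} B^{-1} Q, $$

and $B^{-1}$ is again diagonal (its diagonal entries are the reciprocals of those of $B$, all nonzero since $T$ is invertible), so $T^{-1}$ is diagonalizable.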