From page 95 of Hoffman & Kunze's *Linear Algebra*:
Let $T$ be the linear operator on $\mathbb{R}^2$ defined by
$T(x_1,x_2)=(-x_2,x_1)$
Prove that if $B$ is any ordered basis for $\mathbb{R}^2$ and $[T]_B=A$, then $A_{12}A_{21}\neq0.$
My approach was as follows: First find the matrix of $T$ relative to the standard ordered basis for $\mathbb{R}^2$:
$$A=\begin{bmatrix} 0 & -1\\ 1 & 0\\ \end{bmatrix}$$
Then suppose that $B'$ is any other ordered basis for $\mathbb{R}^2$, say $B'=\{\alpha_1,\alpha_2\}$, where $\alpha_1=(a,b)$ and $\alpha_2=(c,d)$. There exists an invertible matrix $P$ such that, for every vector $\alpha$, $[\alpha]_B=P[\alpha]_{B'}$. Specifically,
$$P=\begin{bmatrix} a & c\\ b & d\\ \end{bmatrix}$$
The matrix of $T$ in $B'$ is computed as follows: $$ A=[T]_{B'}=P^{-1}[T]_BP=\frac{1}{ad-bc}\begin{bmatrix} -(ac+bd) & -(c^2+d^2)\\ a^2+b^2 & ac+bd\\ \end{bmatrix} $$ In order for $A_{12}A_{21}=0$, it must be the case that $$ (c^2+d^2)(a^2+b^2)=0, $$ which forces $\alpha_1=0$ or $\alpha_2=0$. This contradicts the assumption that $B'$ is a basis for $\mathbb{R}^2$, since a basis cannot contain the zero vector.
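The change-of-basis computation above is easy to get wrong by hand, so here is a quick sanity check (my own sketch, stdlib-only Python, not part of the original argument): it conjugates $[T]_B$ by random invertible matrices $P$ using exact rational arithmetic and confirms that the off-diagonal product never vanishes.

```python
from fractions import Fraction
import random

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(P):
    """Invert a 2x2 matrix (assumes nonzero determinant)."""
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    return [[ P[1][1] / det, -P[0][1] / det],
            [-P[1][0] / det,  P[0][0] / det]]

# Matrix of T(x1, x2) = (-x2, x1) in the standard basis.
T = [[Fraction(0), Fraction(-1)],
     [Fraction(1), Fraction(0)]]

random.seed(0)
for _ in range(100):
    a, b, c, d = (Fraction(random.randint(-9, 9)) for _ in range(4))
    if a * d - b * c == 0:
        continue  # alpha_1, alpha_2 dependent: not a basis, skip
    P = [[a, c], [b, d]]               # columns are alpha_1, alpha_2
    A = matmul(inv2(P), matmul(T, P))  # [T]_{B'} = P^{-1} [T]_B P
    # The off-diagonal product is exactly -(c^2+d^2)(a^2+b^2)/(ad-bc)^2,
    # which is nonzero whenever both basis vectors are nonzero.
    assert A[0][1] * A[1][0] == -(c*c + d*d) * (a*a + b*b) / (a*d - b*c)**2
    assert A[0][1] * A[1][0] != 0
```

Using `Fraction` keeps every entry exact, so the asserted closed form is checked without any floating-point tolerance.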
My problem is the following: this proof seems very tedious. I feel that I am missing some insight that would lead to a much more elegant proof. Is this the case, and if so, what am I missing? I would also appreciate any feedback on my exposition; I am new to this and in high school, so I don't really have anyone to get feedback from. Thanks.
Hint: suppose one of $A_{12}$, $A_{21}$ is zero, so that $A$ is triangular and therefore has a real eigenvalue. Can you conclude that $T$ has an eigenvector in $\mathbb{R}^2$, and why is that impossible for this particular $T$?
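For intuition behind the hint: $T$ rotates every vector by $90^\circ$, so $T(v)$ is never a scalar multiple of a nonzero $v$. A short numeric sketch (my addition, not part of the hint) checks this: $v$ and $T(v)$ are collinear iff the determinant of the matrix with columns $v$ and $T(v)$ vanishes, and that determinant works out to $x_1^2+x_2^2>0$.

```python
import random

def T(v):
    """The operator T(x1, x2) = (-x2, x1): rotation by 90 degrees."""
    x1, x2 = v
    return (-x2, x1)

random.seed(1)
for _ in range(1000):
    v = (random.uniform(-10, 10), random.uniform(-10, 10))
    w = T(v)
    # det of the matrix with columns v and T(v); zero iff they are collinear.
    det = v[0] * w[1] - v[1] * w[0]
    # Algebraically det = x1^2 + x2^2, which is > 0 for v != 0,
    # so T has no real eigenvector.
    assert det == v[0] * v[0] + v[1] * v[1]
    assert det > 0
```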