I have a linear operator $T$ which acts on the vector space of square $N\times N$ matrices in this way:
$T(A)=\frac{1}{2}(A-A^\mathrm{t})$
($A^\mathrm{t}$: the transpose of a matrix $A$).
I need to prove that this operator is diagonalizable.
I know that a linear transformation is diagonalizable if and only if there is a basis for the vector space that consists entirely of eigenvectors.
I want to find the possible eigenvalues of this operator, so by definition I need to find the $\lambda$s for which $T(A)=\lambda A$ for some nonzero $A$; substituting the definition of $T$ and rearranging, I get $(1-2\lambda)A=A^\mathrm{t}$.
How can I go on from here?
Thank you guys.
An alternative way to proceed if you don't know about symmetric and skew-symmetric matrices (or projections) and you don't want to try to find the characteristic polynomial for arbitrary $N$ is the following:
Start with the general observation: if $\mathbf{x}$ is an eigenvector of $T$ corresponding to the eigenvalue $\lambda$, then $\mathbf{x}$ is also an eigenvector of $T^2$ ($T$ composed with itself) corresponding to $\lambda^2$:
Indeed, we have that $\mathbf{x}\neq\mathbf{0}$ (since $\mathbf{x}$ is an eigenvector of $T$), and \begin{align*} T^2(\mathbf{x}) &= T(T(\mathbf{x})) = T(\lambda\mathbf{x})\\ &= \lambda T(\mathbf{x}) = \lambda(\lambda\mathbf{x}) = \lambda^2\mathbf{x}. \end{align*}
Caveat. The converse does not hold: you can have $\mathbf{x}$ be an eigenvector of $T^2$, but not of $T$: consider the rotation of the real plane by ninety degrees; then every vector is an eigenvector of $T^2$ (which is rotation by $180^\circ$, i.e. $-I$), but $T$ has no eigenvectors.
So, consider your $T$. If you apply $T$ twice, you get: $$T(T(A)) = T\left(\frac{1}{2}(A-A^t)\right) = \frac{1}{2}\left(\frac{1}{2}(A-A^t) - \frac{1}{2}(A^t-A)\right) = \frac{1}{2}(A-A^t) = T(A).$$ So if $A$ is an eigenvector with corresponding eigenvalue $\lambda$, then you have that $T(A)=\lambda A$ and $T^2(A)=\lambda^2A$. But since $T(A)=T^2(A)$, then $\lambda A = \lambda^2 A$, hence $(\lambda-\lambda^2)A=\mathbf{0}$. Since $A\neq\mathbf{0}$, then $\lambda=\lambda^2$, so either $\lambda=0$ or $\lambda=1$.
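This is not part of the proof, but if you want a quick numerical sanity check of the identity $T^2=T$, here is a short NumPy sketch (the function name `T` and the $4\times 4$ test size are just illustrative choices):

```python
import numpy as np

def T(A):
    """The operator from the question: T(A) = (A - A^t)/2."""
    return 0.5 * (A - A.T)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # a random 4x4 test matrix

# T is idempotent: applying it twice gives the same result as applying it once.
assert np.allclose(T(T(A)), T(A))
```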
So the only possible eigenvalues of $T$ are $\lambda=0$ and $\lambda=1$. (Note: This is true of any linear transformation $T$ such that $T=T^2$; these are called "projections").
Now that we know the possible eigenvalues, it's a bit easier to find the eigenvectors. If $A$ is an eigenvector corresponding to $0$, then we must have $0 = \frac{1}{2}(A-A^t)$, so $A=A^t$; that is, $A$ must be equal to its transpose (symmetric).
If $\lambda=1$, then we must have $A = \frac{1}{2}(A-A^t)$, so $A=-A^t$; that is, $A$ must be skew-symmetric.
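The two eigenspace descriptions can also be checked numerically; here is a NumPy sketch (again just a sanity check, using the standard decomposition of a matrix into its symmetric and skew-symmetric parts):

```python
import numpy as np

def T(A):
    """T(A) = (A - A^t)/2, the operator from the question."""
    return 0.5 * (A - A.T)

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
S = 0.5 * (M + M.T)  # symmetric part of M
K = 0.5 * (M - M.T)  # skew-symmetric part of M

assert np.allclose(T(S), np.zeros_like(S))  # symmetric matrices: eigenvalue 0
assert np.allclose(T(K), K)                 # skew-symmetric matrices: eigenvalue 1
assert np.allclose(S + K, M)                # every matrix splits as symmetric + skew
```

The last assertion is the key to diagonalizability: every matrix is a sum of a symmetric and a skew-symmetric matrix, so the two eigenspaces together span the whole space.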
So this determines all the eigenvectors and eigenvalues of $T$. It is now straightforward to verify that you can find enough linearly independent symmetric and skew-symmetric matrices (to wit, $N(N+1)/2$ symmetric ones and $N(N-1)/2$ skew-symmetric ones, $N^2$ in total) to get a basis for the vector space, so that $T$ is diagonalizable. Indeed, every matrix decomposes as $A = \frac{1}{2}(A+A^t) + \frac{1}{2}(A-A^t)$, a sum of a symmetric and a skew-symmetric matrix.
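If you like, you can confirm the eigenvalue multiplicities numerically by writing $T$ as an $N^2\times N^2$ matrix acting on vectorized matrices. A NumPy sketch (the construction via the transposition permutation matrix `P` is an illustrative choice, not part of the proof):

```python
import numpy as np

N = 3
# vec(T(A)) = 0.5 * (vec(A) - vec(A^t)) = 0.5 * (I - P) vec(A),
# where P is the permutation matrix sending vec(A) to vec(A^t)
# (row-major vectorization: vec(A)[i*N + j] = A[i, j]).
I = np.eye(N * N)
P = np.zeros((N * N, N * N))
for i in range(N):
    for j in range(N):
        P[i * N + j, j * N + i] = 1.0
Tmat = 0.5 * (I - P)

eigvals = np.linalg.eigvalsh(Tmat)  # Tmat is symmetric, so eigvalsh applies
# Eigenvalue 1 has multiplicity N(N-1)/2 (skew-symmetric matrices),
# eigenvalue 0 has multiplicity N(N+1)/2 (symmetric matrices).
assert int(np.sum(np.isclose(eigvals, 1.0))) == N * (N - 1) // 2
assert int(np.sum(np.isclose(eigvals, 0.0))) == N * (N + 1) // 2
```

Since the only eigenvalues are $0$ and $1$ and their multiplicities add up to $N^2$, the matrix of $T$ is diagonalizable.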