I'm given a representation $\Pi : \mathrm{GL}(n,\mathbb{C}) \rightarrow \mathrm{Aut}(\mathrm{Mat}_{n\times n}(\mathbb{C}))$ defined by $\Pi(g)X = gXg^T$ (does it have a name?). The representations $\mathrm{Sym}^2$ and $\Lambda^2$ are then defined in the usual way, by restricting $\Pi$ to the subspaces of symmetric and anti-symmetric matrices, respectively.
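(For concreteness, here is a small numerical sanity check of this setup, just to make sure I have the definition right — the helper `Pi` is my own name, not standard. It checks that $\Pi$ is a homomorphism and that the symmetric and anti-symmetric matrices really are invariant subspaces, so the restrictions make sense.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

def Pi(g):
    """The action Pi(g): X -> g X g^T, as a linear map on matrices."""
    return lambda X: g @ X @ g.T

g = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
h = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Pi is a homomorphism: Pi(gh) = Pi(g) Pi(h), since (gh)^T = h^T g^T.
assert np.allclose(Pi(g @ h)(X), Pi(g)(Pi(h)(X)))

# Pi preserves the symmetric and anti-symmetric subspaces.
S = X + X.T          # symmetric part (doubled)
A = X - X.T          # anti-symmetric part (doubled)
assert np.allclose(Pi(g)(S), Pi(g)(S).T)
assert np.allclose(Pi(g)(A), -Pi(g)(A).T)
```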
I need to prove that each of these representations is irreducible.
I started proving it by contradiction, trying to show that every basis vector lies in a given nonzero invariant subspace. For the symmetric representation I tried using the orthogonal diagonalization of symmetric matrices (for each symmetric $X$ there are an orthogonal $Q$ and a diagonal $D$ such that $X = QDQ^T$), but that path only works for real symmetric matrices, and therefore, in my case, requires handling the real and imaginary parts separately.
I'm almost sure there is a proof that requires less machinery, but I'm having a hard time finding it. Also, I have pretty basic knowledge of representation theory, so maybe I'm missing something.
I'll appreciate any clue.
Thanks in advance!
I think your idea works without too much difficulty. Consider $\mathrm{Sym}^2$ first. I'll show that any symmetric matrix is in the subrep $V$ containing the identity matrix $I$. Acting on $I$ by elements of the form $g = \operatorname{diag}(\sqrt{a_1}, \ldots, \sqrt{a_n})$ shows that any diagonal matrix is in $V$ (complex square roots always exist; $g$ is invertible when every $a_i \neq 0$, and diagonal matrices with zero entries then follow by linearity, since $V$ is a subspace). Now given any symmetric $Z = X + iY$ with $X, Y$ real, both $X$ and $Y$ are symmetric. By the spectral theorem both $X$ and $Y$ are diagonalizable by real orthogonal matrices, say $X = QDQ^T = \Pi(Q)D$ with $D$ diagonal, so $X, Y \in V$. Thus $Z = X + iY \in V$. (To conclude irreducibility, one still has to check that every nonzero invariant subspace contains $I$, which a similar diagonalize-and-rescale argument should give.)
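(A quick numerical illustration of the two steps above, in case it helps — numpy, nothing here is standard notation. Step 1 checks that $\Pi(\operatorname{diag}(\sqrt{a_i}))\,I = \operatorname{diag}(a_i)$; step 2 checks that a real symmetric $X$ is $\Pi(Q)D$ for orthogonal $Q$ and diagonal $D$.)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Step 1: acting on I by g = diag(sqrt(a_1), ..., sqrt(a_n)) yields
# diag(a_1, ..., a_n), an arbitrary diagonal matrix (a_i generically nonzero,
# so g is invertible).
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
g = np.diag(np.sqrt(a))                       # principal complex square roots
assert np.allclose(g @ np.eye(n) @ g.T, np.diag(a))

# Step 2: a real symmetric X equals Pi(Q) D with Q real orthogonal and
# D diagonal (spectral theorem), so X lies in any invariant subspace
# containing all diagonal matrices.
X = rng.standard_normal((n, n))
X = X + X.T                                   # real symmetric
w, Q = np.linalg.eigh(X)                      # X = Q diag(w) Q^T
assert np.allclose(Q @ np.diag(w) @ Q.T, X)
```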
For the anti-symmetric case, a similar argument probably works using the block decomposition of real skew-symmetric matrices: https://en.wikipedia.org/wiki/Skew-symmetric_matrix#Spectral_theory
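(One concrete piece of that argument, checked numerically below with names of my own choosing: every standard basis anti-symmetric matrix $E_{ij} - E_{ji}$ is in the $\Pi$-orbit of the single block $J = E_{12} - E_{21}$, since $gJg^T = (ge_1)(ge_2)^T - (ge_2)(ge_1)^T$. So once the block decomposition puts one such block in the subrep, linearity gives the whole space.)

```python
import numpy as np

n = 4

# The standard anti-symmetric "block" J = E_12 - E_21 (all other entries 0).
J = np.zeros((n, n))
J[0, 1], J[1, 0] = 1.0, -1.0

def basis_antisym(i, j, n):
    """E_ij - E_ji, a standard basis element of the anti-symmetric matrices."""
    B = np.zeros((n, n))
    B[i, j], B[j, i] = 1.0, -1.0
    return B

# For each pair i < j, pick an invertible g whose first two columns are
# e_i and e_j; then Pi(g) J = g J g^T = E_ij - E_ji.
for i in range(n):
    for j in range(i + 1, n):
        cols = [i, j] + [k for k in range(n) if k not in (i, j)]
        g = np.eye(n)[:, cols]      # a permutation matrix, hence invertible
        assert np.allclose(g @ J @ g.T, basis_antisym(i, j, n))
```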