I have a good reason to believe that the following result is true, but can someone help me come up with an argument?
Let $K$ be an algebraically closed field (of characteristic zero, if necessary), and let $A = M_n(K)$ be the $n\times n$ matrix algebra. Assume that $x\mapsto x^*$ is an involution on $A$, and define $$ S(A) = \{x\in A\mid x^* = -x\} $$ to be the set of skew elements in $A$. Prove that $S(A)$ generates $A$ as an algebra (with unit) over $K$.
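For one concrete instance of the claim: take $*$ to be transposition on $M_3(K)$, so that $S(A)$ is the skew-symmetric matrices. A quick numerical check (numpy; the setup is mine, not part of the question) that the unit together with words of length at most $2$ in skew elements already spans all of $M_3$:

```python
import numpy as np
from itertools import product

n = 3

def E(i, j):
    # standard basis matrix e_{i,j} (0-indexed)
    m = np.zeros((n, n))
    m[i, j] = 1.0
    return m

# basis of the skew-symmetric matrices S(A) for * = transpose on M_3
skew_basis = [E(0, 1) - E(1, 0), E(0, 2) - E(2, 0), E(1, 2) - E(2, 1)]

# the unit, the skew elements, and all products of two skew elements
words = [np.eye(n)] + skew_basis + [a @ b for a, b in product(skew_basis, repeat=2)]

# the span of these 13 words is all of M_3: the 13 x 9 coefficient matrix has rank 9
span_dim = np.linalg.matrix_rank(np.array([w.flatten() for w in words]))
print(span_dim)  # 9 = dim M_3
```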
EDIT: By an involution I mean a ring antiautomorphism that is its own inverse. So $(xy)^* = y^* x^*$ for all $x,y$, $(x+y)^* = x^* + y^*$, and $(x^*)^* = x$.
EDIT: You may assume that $x\mapsto x^*$ is $K$-linear, for otherwise the result follows like this: Let $$ F(A) = \{x\in A\mid x^* = x\} $$ be the set of fixed elements in $A$. If $x\mapsto x^*$ is not $K$-linear, it is antilinear, i.e. $(\alpha x)^* = \overline\alpha x^*$ for $\alpha\in K$, $x\in A$, where $\alpha\mapsto\overline\alpha$ is a nontrivial involution on $K$. Define $F(K)$ and $S(K)$ analogously as the sets of fixed and skew elements of $K$ under $\alpha\mapsto\overline\alpha$. Then $L = F(K)$ is a field, and $F(A)$, $S(A)$ are $L$-subspaces of $A$ with $A = F(A)\oplus S(A)$. Since $\alpha\mapsto\overline\alpha$ is not the identity, the analogous decomposition $K = F(K)\oplus S(K)$ into $L$-subspaces yields some $\alpha\in S(K)$ with $\alpha\neq 0$. The map $x\mapsto\alpha x$ is an $L$-linear bijection of $A$ with $\alpha S(A)\subset F(A)$ and $\alpha F(A)\subset S(A)$; since it is bijective and swaps the two summands, $\alpha S(A) = F(A)$. Because $A = F(A)\oplus S(A)$ and $F(A) = \alpha S(A)$ lies in the $K$-span of $S(A)$, the set $S(A)$ already spans $A$ over $K$.
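To illustrate the antilinear case numerically (with $K=\mathbb C$ and $*$ the conjugate transpose, so $\overline\alpha$ is complex conjugation, $L=\mathbb R$, and $\alpha = i$ is a nonzero skew scalar; the setup is mine):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Conjugate transpose is an antilinear involution on M_3(C):
# F(A) = Hermitian matrices, S(A) = skew-Hermitian matrices.
f = (x + x.conj().T) / 2   # Hermitian part, f in F(A)
s = (x - x.conj().T) / 2   # skew-Hermitian part, s in S(A)

assert np.allclose(f + s, x)                      # A = F(A) + S(A)
assert np.allclose((1j * s).conj().T, 1j * s)     # alpha * S(A) lands in F(A)
assert np.allclose((1j * f).conj().T, -(1j * f))  # alpha * F(A) lands in S(A)
ok = True
```

So multiplication by the skew scalar $i$ swaps the two summands, exactly as in the argument above.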
Let $e_{i,j}$ denote the standard basis matrices of $M_n$ (i.e., $e_{i,j}$ has an entry $1$ at position $(i,j)$ and is zero otherwise).
Because $f(x)^*=f(x^*)$ for any polynomial $f\in K[X]$, the matrices $x^*$ and $x$ have the same minimal polynomial.
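A quick numerical illustration of $f(x)^*=f(x^*)$ with $*$ given by transposition (numpy; the test polynomial is mine):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal((4, 4))

# For * = transpose: (x^k)^T = (x^T)^k because all factors are equal,
# so any polynomial f commutes with the involution.
def f(m):
    # an arbitrary test polynomial f(X) = X^3 - 2X + 5
    return m @ m @ m - 2 * m + 5 * np.eye(4)

same = np.allclose(f(x).T, f(x.T))
print(same)  # True; in particular x and x^T annihilate the same polynomials
```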
Let $B$ be the set of matrices with minimal polynomial $X^n$; these are precisely the matrices similar to the full shift $\sum_{k=1}^{n-1}e_{k,k+1}$. Since $\ker(b^k)$ is $k$-dimensional and runs over all $k$-dimensional subspaces of $K^n$ as $b$ runs over $B$, we have $\operatorname{rk}(x)=\min\{\,k\in \mathbb N_0\mid\exists b\in B\colon b^kx=0\,\}$. As $b\in B$ iff $b^*\in B$, and $b^kx=0$ iff $x^*(b^*)^k=0$, the analogous description of rank by right annihilators shows $\operatorname{rk}(x^*)=\operatorname{rk}(x)$.
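The minimum in this rank formula is attained by an explicit witness: if $\operatorname{rk}(x)=r$, conjugate the full shift so that $\ker(b^r)$ contains the column space of $x$. A numerical sketch over $\mathbb R$ (numpy; the construction of $P$ is mine, and uses that the first $r$ columns of a generic rank-$r$ matrix span its column space):

```python
import numpy as np

rng = np.random.default_rng(2)
n, r = 5, 3

# x with rank exactly r
x = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
assert np.linalg.matrix_rank(x) == r

# N = full shift sum_k e_{k,k+1}; ker(N^k) = span(e_1, ..., e_k)
N = np.diag(np.ones(n - 1), k=1)

# b = P N P^{-1} where the first r columns of P span col(x):
# then col(x) lies in ker(b^r), so b^r x = 0.
P = np.column_stack([x[:, :r], rng.standard_normal((n, n - r))])
assert np.linalg.matrix_rank(P) == n
b = P @ N @ np.linalg.inv(P)

assert np.allclose(np.linalg.matrix_power(b, r) @ x, 0)          # k = r works
assert not np.allclose(np.linalg.matrix_power(b, r - 1) @ x, 0)  # k = r - 1 fails for this b
ok = True
```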
If $x$ is semisimple then the multiplicity of an eigenvalue $\lambda\in K$ can be read off from $\operatorname{rk}(x-\lambda)$. We conclude that $x^*$ is similar to $x$ for semisimple $x$. Similarly, the block structure of a nilpotent matrix $x$ can be read off from the ranks of the powers $x^k$. Finally, in the Jordan–Chevalley decomposition $x=x_s+x_n$ with $x_s$ semisimple, $x_n$ nilpotent, and $x_sx_n=x_nx_s$, the complete Jordan block structure can be read off from the ranks $\operatorname{rk}((x-\lambda)^k)$. Hence $x$ and $x^*$ always have the same Jordan block structure, and so they are similar.
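The "read the block structure from ranks" step can be made explicit: the number of Jordan blocks for $\lambda$ of size at least $k$ equals $\operatorname{rk}((x-\lambda)^{k-1})-\operatorname{rk}((x-\lambda)^k)$. A small numerical check (numpy; the example matrix is mine):

```python
import numpy as np

def blocks_of_size_at_least(x, lam, k):
    # number of Jordan blocks for lam of size >= k, computed from ranks:
    # rk((x - lam)^{k-1}) - rk((x - lam)^k)
    n = x.shape[0]
    m = x - lam * np.eye(n)
    r = lambda j: np.linalg.matrix_rank(np.linalg.matrix_power(m, j))
    return r(k - 1) - r(k)

# x = J_2(0) + J_1(0) + J_1(7) in Jordan form
x = np.zeros((4, 4))
x[0, 1] = 1.0   # J_2(0) on coordinates 0, 1; coordinate 2 is J_1(0)
x[3, 3] = 7.0   # J_1(7)

counts = [blocks_of_size_at_least(x, 0.0, k) for k in (1, 2, 3)]
print(counts)  # [2, 1, 0]: two blocks for 0 of size >= 1, one of size >= 2
```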
Let $x_s=\sum_{k=1}^nke_{k,k}$, so that the standard basis vectors are (up to scalars) the unique eigenvectors of $x_s$. Since $x_s^*$ is similar to $x_s$, write $x_s^*=yx_sy^{-1}$; the columns of $y$ again form a basis of eigenvectors, unique up to scalars, i.e. the similarity matrix $y$ is determined up to right multiplication by an invertible diagonal matrix. Comparing with the eigenvectors of $x_s+ne_{k,k}$, we conclude that the same similarity matrix $y$ works for all diagonal matrices, and in particular $e_{k,k}^*=ye_{k,k}y^{-1}$.

Now let $i\ne j$. The only $k$ with $e_{k,k}e_{i,j}\ne0$ is $k=i$. Hence the only $k$ with $e_{i,j}^*e_{k,k}^*\ne0$ is $k=i$. But $e_{i,j}^*e_{k,k}^*=e_{i,j}^*ye_{k,k}y^{-1}$, so $y^{-1}e_{i,j}^*y$ is zero except on column $i$. Similarly, starting from the fact that $e_{i,j}e_{k,k}\ne0$ only for $k=j$, we find that $y^{-1}e_{i,j}^*y$ is zero except on row $j$. Thus $e_{i,j}^*=\beta_{i,j}ye_{j,i}y^{-1}$ for suitable $\beta_{i,j}\in K$ (note the transposed indices: an antiautomorphism should behave like a transpose).

We replace $y$ with $yd$, where $d$ is an invertible diagonal matrix chosen to turn all $\beta_{i,i+1}$ into $1$. This is possible because $d^{-1}y^{-1}e_{i,i+1}^*yd=\beta_{i,i+1}d^{-1}e_{i+1,i}d=\frac{d_{i,i}\beta_{i,i+1}}{d_{i+1,i+1}}e_{i+1,i}$, so we can let $d_{i+1,i+1}=\beta_{i,i+1}d_{i,i}$, starting with $d_{1,1}=1$. Applying $*$ to $e_{i,i+1}e_{i+1,i}=e_{i,i}$ gives $\beta_{i,i+1}\beta_{i+1,i}=1$, so this normalization also turns all $\beta_{i+1,i}$ into $1$. By now we have a common similarity matrix $y$ with $e_{i,j}^*=ye_{j,i}y^{-1}$ for all diagonal and near-diagonal basis matrices. As all other $e_{i,j}$ are products of such near-diagonal elements, the same similarity matrix works for them as well.

By linearity, we conclude that there exists an invertible matrix $y$ such that $x^*=yx^Ty^{-1}$ for all $x$. Applying the involution twice shows that $y^Ty^{-1}$ is central, hence $y^T=\pm y$. If $y$ is symmetric, find $z$ with $zz^T=y$ (possible over an algebraically closed field); then $x^*=z(z^{-1}xz)^Tz^{-1}$, and the star operator is in fact just transposition with respect to another basis. If $y$ is antisymmetric, the star is the standard symplectic involution in another basis. Either way, the problem is thus reduced to a known case.
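A numerical sanity check of the final reduction, under the assumption that the involution has the form $x^*=yx^Ty^{-1}$ with $y$ symmetric (here over $\mathbb R$ with $y$ positive definite, so that a Cholesky factor $y=zz^T$ exists; the variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

# a symmetric positive definite y, so numpy's Cholesky gives y = z z^T
m = rng.standard_normal((n, n))
y = m @ m.T + n * np.eye(n)
z = np.linalg.cholesky(y)
yinv = np.linalg.inv(y)

# the candidate involution x* = y x^T y^{-1}
star = lambda x: y @ x.T @ yinv

a, b = rng.standard_normal((n, n)), rng.standard_normal((n, n))
assert np.allclose(star(star(a)), a)                  # involutive
assert np.allclose(star(a @ b), star(b) @ star(a))    # antiautomorphism

# in the basis given by z, * becomes plain transposition:
# z^{-1} (y a^T y^{-1}) z = (z^{-1} a z)^T
zinv = np.linalg.inv(z)
assert np.allclose(zinv @ star(a) @ z, (zinv @ a @ z).T)
ok = True
```

This is only a sketch of the symmetric case; the antisymmetric case would need a symplectic normal form for $y$ instead of a Cholesky-type factorization.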