Suppose $S$ is a subalgebra of the matrix algebra $M_n(\mathbb{C})$. If for every nonzero vector $v$ and every vector $w$ in $\mathbb{C}^n$ there exists a matrix $A$ in $S$, depending on $v$ and $w$ of course, which sends $v$ to $w$, is $S$ necessarily $M_n(\mathbb{C})$?
This technical question arose while I was reading Wallach's book Real Reductive Groups. I think the answer is yes, but I am not sure. Can anyone help me prove it, or give a counterexample if I am wrong? Thank you!
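As a sanity check on the hypothesis (my own illustration, not part of the question): the full algebra $M_n(\mathbb{C})$ does satisfy it, since for nonzero $v$ the rank-one matrix $A = wv^*/(v^*v)$ sends $v$ to $w$. A quick numerical sketch:

```python
import numpy as np

# For nonzero v and any w in C^n, the rank-one matrix
#   A = w v* / (v* v)
# lies in M_n(C) and satisfies A v = w, since
#   A v = w (v* v) / (v* v) = w.
rng = np.random.default_rng(0)
n = 4
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

A = np.outer(w, v.conj()) / np.vdot(v, v)
assert np.allclose(A @ v, w)  # A sends v to w
```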
The answer is yes. By the Wedderburn–Malcev theorem [http://www.math.uni-bielefeld.de/~sek/select/RF6.pdf], $S=A\oplus J(S)$ as ${\mathbb C}$-vector spaces, where $A$ is a unital semisimple ${\mathbb C}$-algebra and $J(S)$ is the Jacobson radical of $S$ (the maximal nilpotent two-sided ideal of $S$). We claim that $J(S)=0$. Suppose otherwise. Then there exist $j\in J(S)$ and $v\in{\mathbb C}^n$ such that $j(v)\ne0$. By hypothesis there exists $s\in S$ such that $sj(v)=v$. So $(sj)^k(v)=v$ for all $k\geq1$, whence $sj$ is not nilpotent. But this contradicts the fact that $sj\in J(S)$, a nilpotent ideal. This proves our claim.
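The key step above — a nilpotent operator cannot fix a nonzero vector — can be checked numerically. A small sketch (my own illustration, using a strictly upper triangular matrix, which is automatically nilpotent):

```python
import numpy as np

# N strictly upper triangular => N^n = 0, so N is nilpotent.
n = 4
N = np.triu(np.ones((n, n)), k=1)
assert np.allclose(np.linalg.matrix_power(N, n), 0)  # nilpotency

# All eigenvalues of a nilpotent matrix are 0, so 1 is never an
# eigenvalue: N v = v forces v = N^n v = 0, i.e. only v = 0 works.
assert not np.isclose(np.linalg.eigvals(N), 1).any()
```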
Now $S=A$ is a semisimple ${\mathbb C}$-algebra. So it is a direct product $\prod_i M_{n_i}({\mathbb C})$ of matrix algebras, as ${\mathbb C}$ is algebraically closed. With respect to a suitable choice of basis for ${\mathbb C}^n$, we can represent $S$ by block-diagonal matrices, with blocks of sizes $n_1,n_2,\dots$ where $n_1+n_2+\dots=n$. If there were more than one block, a nonzero vector supported in the first block could never be sent to a nonzero vector supported in the second. The hypothesis therefore forces $n_1=n$ and $S=M_n({\mathbb C})$.
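For intuition on the last step, here is a small sketch (my own example) showing that a proper block-diagonal subalgebra already fails the hypothesis: inside $M_2(\mathbb{C})$, the subalgebra of diagonal matrices $\mathrm{diag}(a,b)$ preserves the coordinate axes, so no element sends $e_1$ to $e_2$.

```python
import numpy as np

# S = M_1(C) x M_1(C) embedded as diagonal matrices in M_2(C).
# Any A = diag(a, b) in S maps e1 = (1, 0) to (a, 0), which is
# never e2 = (0, 1) -- the blocks cannot "talk" to each other.
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

for a in np.linspace(-2, 2, 9):
    for b in np.linspace(-2, 2, 9):
        A = np.diag([a, b])
        assert not np.allclose(A @ e1, e2)
```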