Prove the following proposition.
If the $n\times n$ matrix $X$ has eigenvalues $\{\lambda_j\mid j=1,2,\dots,n\}$ (counted with multiplicity), then $\operatorname{ad}X$, defined by $\operatorname{ad}X(Y):=[X,Y]:=XY-YX$ for every square matrix $Y$ of the same size as $X$, has the $n^2$ eigenvalues $\{\lambda_j-\lambda_k\mid j,k=1,2,\dots,n\}$, also counted with multiplicity.
We can prove that each $\lambda_j-\lambda_k$ is an eigenvalue of $\operatorname{ad}X$ by picking $Y:=u_j\otimes v_k$, where $u_j$ and $v_k$ are eigenvectors of $X$ and $X^\dagger$ respectively, as done in this proof. It remains to show that these are all the eigenvalues of $\operatorname{ad}X$, and to account for their multiplicities.
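As a numerical sanity check of this first step, here is a short NumPy sketch (variable names are my own; $u_j\otimes v_k$ is interpreted as the outer product $u_j v_k^\dagger$, and a random complex $X$ is assumed to have distinct eigenvalues, which holds generically). It builds $Y = u_j v_k^\dagger$ from an eigenvector $u_j$ of $X$ and an eigenvector $v_k$ of $X^\dagger$ with eigenvalue $\bar\lambda_k$, and checks $\operatorname{ad}X(Y)=(\lambda_j-\lambda_k)Y$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

lam, U = np.linalg.eig(X)            # X @ U[:, j] = lam[j] * U[:, j]
mu, V = np.linalg.eig(X.conj().T)    # X^dagger @ V[:, k] = mu[k] * V[:, k]

j, k = 0, 2
# find the eigenvector of X^dagger whose eigenvalue is conj(lam[k])
idx = np.argmin(np.abs(mu - lam[k].conj()))
u_j = U[:, j]
v_k = V[:, idx]

Y = np.outer(u_j, v_k.conj())        # Y = u_j v_k^dagger
adXY = X @ Y - Y @ X
assert np.allclose(adXY, (lam[j] - lam[k]) * Y)
```

The point is that $XY=\lambda_j Y$ and $YX=\lambda_k Y$, so $[X,Y]=(\lambda_j-\lambda_k)Y$.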
Here is a proof I do not quite understand. Presumably $M$ in the proof is the operator or its corresponding matrix.
- Is $E^{j,k}:=u_j\otimes v_k$, where $u_j$ and $v_k$ are the eigenvectors of $X$ and $X^\dagger$ from above?
- Are we stacking the columns of a matrix into one column vector and concatenating all of the resulting vectors into one large square matrix? Do we then speak of the triangularity of that large matrix? How exactly?
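Regarding the second question, column stacking (vectorization) does make this concrete. A hedged NumPy sketch (names and setup are my own illustration, not from the proof): under column stacking, $\operatorname{vec}(XY-YX)=(I\otimes X-X^\top\otimes I)\operatorname{vec}(Y)$, so $\operatorname{ad}X$ is represented by an $n^2\times n^2$ matrix whose eigenvalues are the pairwise differences of the eigenvalues of $X$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
X = rng.standard_normal((n, n))
I = np.eye(n)

# Column stacking: vec(XY) = (I kron X) vec(Y) and vec(YX) = (X^T kron I) vec(Y),
# so the n^2 x n^2 matrix representing ad X is:
M = np.kron(I, X) - np.kron(X.T, I)

vec = lambda B: B.flatten(order="F")   # stack the columns of B
Y = rng.standard_normal((n, n))
assert np.allclose(M @ vec(Y), vec(X @ Y - Y @ X))

# every pairwise difference lam_j - lam_k appears among the eigenvalues of M,
# and vice versa (compared by nearest distance to avoid sort-order issues)
lam = np.linalg.eigvals(X)
ev = np.linalg.eigvals(M)
diffs = (lam[:, None] - lam[None, :]).ravel()
for d in diffs:
    assert np.min(np.abs(ev - d)) < 1e-8
for e in ev:
    assert np.min(np.abs(diffs - e)) < 1e-8
```

The triangularity in the proof refers to this $n^2\times n^2$ matrix, after a suitable reordering of the basis.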

Here are the details of the proof.
In general, the matrix $A$ of a linear transformation $L$ from $M$ to $M$, with respect to a basis $\{v_\mu\mid\mu\in I\}$ for some index set $I$, is defined by $Lv_\mu=\sum_{\nu\in I} v_\nu A_{\nu,\mu}$, where $A_{\nu,\mu}$ denotes the $(\nu,\mu)$ entry of $A$.
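To see this definition in action for $L=\operatorname{ad}X$ in the matrix-unit basis $E^{j,k}$, note that a matrix $B$ expands as $B=\sum_{i,m}B_{i,m}E^{i,m}$, so $A_{(i,m),(j,k)}$ is just the $(i,m)$ entry of $\operatorname{ad}X(E^{j,k})$. The following NumPy sketch (my own illustration) checks this against the closed form $X_{i,j}\delta_{k,m}-\delta_{i,j}X_{k,m}$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
X = rng.standard_normal((n, n))

def E(j, k, n=n):
    """Matrix unit E^{j,k}: 1 in position (j, k), 0 elsewhere."""
    B = np.zeros((n, n))
    B[j, k] = 1.0
    return B

# A_{(i,m),(j,k)} is the coefficient of E^{i,m} in ad X (E^{j,k}),
# i.e. simply the (i, m) entry of ad X (E^{j,k}).
for j in range(n):
    for k in range(n):
        adE = X @ E(j, k) - E(j, k) @ X
        for i in range(n):
            for m in range(n):
                closed = X[i, j] * (k == m) - (i == j) * X[k, m]
                assert np.isclose(adE[i, m], closed)
```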
Assume without loss of generality that $X$ is lower triangular; this can be achieved via the Schur triangularization or the Jordan canonical form, and conjugating $X$ merely conjugates $\operatorname{ad}X$, so the eigenvalues are unaffected. Now apply the general formulation above to $\operatorname{ad}X$: let $L=\operatorname{ad}X$, $I=\{(i,j)\mid i,j\in\{1,2,\dots,n\}\}$, $\mu:=(j,k)$, $\nu:=(i,m)$, and $v_\mu=v_{(j,k)}=E^{j,k}$, where $E^{j,k}$ is the matrix unit with $E^{j,k}_{l,m}=\delta_{j,l}\delta_{k,m}$. Choose any total order $<$ on $I$ such that $(a,b)<(c,d)$ whenever $a-b<c-d$ (the difference $a-b$ alone gives only a preorder; break ties arbitrarily). Then $$A_{\nu,\mu}=A_{(i,m),(j,k)}=\sum_{l=1}^n\big(X_{i,l}E^{j,k}_{l,m}-E^{j,k}_{i,l}X_{l,m}\big) =X_{i,j}\delta_{k,m}-\delta_{i,j}X_{k,m}.$$ Since $X$ is lower triangular, $A_{(i,m),(j,k)}\neq0$ with $(i,m)\neq(j,k)$ forces either ($i>j$ and $k=m$) or ($i=j$ and $k>m$); in both cases $i-m>j-k$, i.e. $(i,m)>(j,k)$. So $A$ is lower triangular in this ordering, and its diagonal entries $A_{\mu,\mu}=A_{(j,k),(j,k)}=X_{j,j}-X_{k,k}=\lambda_j-\lambda_k$ are precisely the $n^2$ eigenvalues of $\operatorname{ad}X$, counted with multiplicity.
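The triangularity argument can be illustrated numerically as well (a NumPy sketch under my own naming; the ordering below is one concrete total order refining $j-k$): take a random lower triangular $X$, order the matrix units $E^{j,k}$ by $j-k$ with lexicographic tie-breaking, build $A$ from the closed form, and confirm that $A$ is lower triangular with the claimed diagonal.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
X = np.tril(rng.standard_normal((n, n)))   # lower triangular X

# a total order on I = {(j, k)} refining the preorder by j - k
order = sorted(((j, k) for j in range(n) for k in range(n)),
               key=lambda p: (p[0] - p[1], p))
pos = {p: r for r, p in enumerate(order)}

# build A from the closed form A_{(i,m),(j,k)} = X_{i,j} d_{k,m} - d_{i,j} X_{k,m}
A = np.zeros((n * n, n * n))
for (i, m) in order:
    for (j, k) in order:
        A[pos[(i, m)], pos[(j, k)]] = X[i, j] * (k == m) - (i == j) * X[k, m]

# A is lower triangular in this ordering ...
assert np.allclose(np.triu(A, 1), 0)
# ... with diagonal entries X_{j,j} - X_{k,k}
diag = [X[j, j] - X[k, k] for (j, k) in order]
assert np.allclose(np.diag(A), diag)
```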