Stuck on showing a property of $H$ as a representation of $\mathfrak{sl}(2, \mathbb{C})$


Let $V$ be a finite dimensional representation of $\mathfrak{sl}(2, \mathbb{C})$. Then $V$ carries operators $E, F, H$ satisfying $HE - EH = 2E$, $HF - FH = -2F$, $EF - FE = H$. For a linear map, let $\bar{V}(\lambda)$ denote the generalized eigenspace corresponding to the eigenvalue $\lambda$.

I can show the following facts. If $\lambda$ is an eigenvalue of $H$ with maximal real part, then $E$ restricted to $\bar{V}(\lambda)$ is the zero map. If $v \in \ker E$ is non-zero, then $E^k F^k v = P_k(H) v$, where $P_k(H) = k! H (H - 1) \cdots (H - (k - 1))$. If $v \in \bar{V}(\lambda)$ for $H$, for any choice of $\lambda$, then there exists $N > 0$ such that $F^N v = 0$.
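For readers who want to experiment, the defining relations and the identity $E^k F^k v = P_k(H) v$ can be sanity-checked numerically in the standard $(n+1)$-dimensional irreducible representation of $\mathfrak{sl}(2, \mathbb{C})$ (an explicit matrix model not used in the question; the choice here is purely for illustration):

```python
import math
import numpy as np

# Explicit model: the (n+1)-dimensional irreducible representation,
# in the basis v_0, ..., v_n with
#   H v_j = (n - 2j) v_j,  F v_j = v_{j+1},  E v_{j+1} = (j+1)(n-j) v_j.
n, k = 4, 2
d = n + 1
H = np.diag([n - 2.0 * j for j in range(d)])
F = np.diag(np.ones(n), -1)                          # ones on the subdiagonal
E = np.diag([(j + 1.0) * (n - j) for j in range(n)], 1)

# The defining relations HE - EH = 2E, HF - FH = -2F, EF - FE = H
assert np.allclose(H @ E - E @ H, 2 * E)
assert np.allclose(H @ F - F @ H, -2 * F)
assert np.allclose(E @ F - F @ E, H)

# For v in ker E (the highest weight vector v_0):
# E^k F^k v = P_k(H) v with P_k(H) = k! H (H - 1) ... (H - (k - 1))
v = np.zeros(d); v[0] = 1.0
Pk = math.factorial(k) * np.eye(d)
for i in range(k):
    Pk = Pk @ (H - i * np.eye(d))
lhs = np.linalg.matrix_power(E, k) @ np.linalg.matrix_power(F, k) @ v
assert np.allclose(lhs, Pk @ v)   # both sides equal k! n(n-1)...(n-k+1) v
```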

I am next asked to show that $H$ is diagonalizable on the generalized eigenspace $\bar{V}(\lambda)$ of $H$ by using the fact that $P_k$ has no repeated roots and computing $E^N F^N v$, where $N$ is chosen as above. In particular, I can show that if $v \in \bar{V}(\lambda)$ is non-zero, $(H - a_1) \cdots (H - a_n) v = 0$, and the $a_i$ are pairwise distinct, then $a_i = \lambda$ for some $i$. EDIT: the $a_i$ need not be distinct in general, but they are in $P_k(H)$.

I am then not quite sure how to proceed. The identity $E^N F^N v = P_N(H) v$ requires $Ev = 0$, which need not hold, since $Ev \in \bar{V}(\lambda + 2)$ when $v \in \bar{V}(\lambda)$. I have also tried examining the identity $$ E^k F^k - E^{k - 1} F^k E = k (H - (k - 1)) E^{k - 1} F^{k - 1}, $$ but to no avail.

However, if we can establish $E^N F^N v = 0 = P_N(H) v$, then we know that $\lambda$ is one of $0, 1, \dots, N - 1$. Furthermore, since the factors $H - a_i$ all commute with each other, we can write $N! \big[\prod_{i \neq \lambda} (H - i)\big] (H - \lambda)v = 0$. If $(H - \lambda)v \neq 0$, then some $i \neq \lambda$ would have to equal $\lambda$, which is absurd. Thus $v$ is an ordinary eigenvector of $H$, so $H$ is diagonalizable on $\bar{V}(\lambda)$, where it acts as the scalar $\lambda$ (a diagonal matrix with $\dim \bar{V}(\lambda)$ entries equal to $\lambda$).
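As a concrete instance of this argument (again using the standard $(n+1)$-dimensional irreducible representation as an explicit model, purely for illustration and not part of the original setup): for the highest weight vector, the smallest $N$ with $F^N v = 0$ is $N = n + 1$, and the $H$-eigenvalue $n$ is then indeed among the simple roots $0, 1, \ldots, N - 1$ of $P_N$:

```python
import math
import numpy as np

# Explicit model: the (n+1)-dimensional irreducible representation,
# with H v_j = (n - 2j) v_j, F v_j = v_{j+1}, E v_{j+1} = (j+1)(n-j) v_j.
n = 3
d = n + 1
H = np.diag([n - 2.0 * j for j in range(d)])
F = np.diag(np.ones(n), -1)
E = np.diag([(j + 1.0) * (n - j) for j in range(n)], 1)

v = np.zeros(d); v[0] = 1.0        # highest weight vector: Hv = n v, Ev = 0
N = n + 1                          # F^N v = 0 for this N
assert np.allclose(np.linalg.matrix_power(F, N) @ v, 0)

# Hence E^N F^N v = 0 = P_N(H) v.  The roots of
# P_N(x) = N! x (x - 1) ... (x - (N - 1)) are exactly 0, 1, ..., N - 1,
# all simple, so the eigenvalue n of H on v must be one of them:
P_N = lambda x: math.factorial(N) * math.prod(x - i for i in range(N))
assert P_N(float(n)) == 0.0        # lambda = n is a root of P_N
assert n in range(N)               # i.e. n lies in {0, 1, ..., N - 1}
```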


There are 4 best solutions below

Answer 1 (score 6):

It seems you have answered your own question. If I read correctly, the only thing you need is that $E^NF^Nv = 0$, but this follows immediately from the result $F^Nv = 0$ you already have.

Answer 2 (score 4):

Here is a partial answer that I don't have time to finish right now:

First, there exists an $M$ such that $v'=E^Mv\in \bar{V}(\lambda+2M)$ is non-zero and $\lambda + 2M$ has maximal real part (so that $v'\in \ker E$). Now, let $N$ be such that $F^Nv'=0$ and show that $v'$ is an eigenvector for $H$.

Next, prove by induction on $k$ that $F^kv'$ is an eigenvector for $H$. When $k=1$ we have $$HFv'=(FH+[H,F])v'=(\lambda+2(M-1))Fv'$$ and, by induction $HF^kv'=(\lambda+2(M-k))F^kv'$. In particular, $F^Mv'=F^ME^Mv$ is an eigenvector for $H$.

It is left to argue that $F^ME^Mv$ is a nonzero multiple of $v$.
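The ladder in this answer can be checked numerically in the standard $(n+1)$-dimensional irreducible representation (an explicit model chosen here only for illustration; the answer itself does not assume it). A minimal sketch with numpy:

```python
import numpy as np

# Explicit model: the (n+1)-dimensional irreducible representation.
n = 4
d = n + 1
H = np.diag([n - 2.0 * j for j in range(d)])
F = np.diag(np.ones(n), -1)
E = np.diag([(j + 1.0) * (n - j) for j in range(n)], 1)

# Take v with H v = lambda v, lambda = n - 2j; then M = j steps of E
# land v' = E^M v in ker E (the highest weight line).
j = 2; lam = n - 2 * j
v = np.zeros(d); v[j] = 1.0
M = j
vp = np.linalg.matrix_power(E, M) @ v           # v' = E^M v
assert np.allclose(E @ vp, 0)                   # v' is killed by E
assert np.allclose(H @ vp, (lam + 2 * M) * vp)  # eigenvalue lambda + 2M

# The ladder: H F^k v' = (lambda + 2(M - k)) F^k v'
for k in range(1, n + 1):
    w = np.linalg.matrix_power(F, k) @ vp
    assert np.allclose(H @ w, (lam + 2 * (M - k)) * w)

# Finally, F^M E^M v is a nonzero multiple of v
u = np.linalg.matrix_power(F, M) @ vp
c = u[j]
assert c != 0 and np.allclose(u, c * v)
```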

Answer 3 (score 3):

For easy readability I post as an answer what I wrote in the comments to David's answer.

  • By Weyl's theorem (Wikipedia) we have that $V = \bigoplus V_i$ with each $V_i$ irreducible. It follows that $v = v_1 \oplus \ldots \oplus v_n$ with each $v_i$ in the corresponding $V_i$. It suffices to show that each $v_i$ is an eigenvector for $H$.

  • Each $V_i$ contains an element $v'_i$ such that $Ev'_i = 0$. By your argument $v'_i$ is an eigenvector for $H$.

  • By irreducibility $v_i = Xv_i'$ for $X$ some complicated sum of products of $E, F$ and $H$. Since we saw in the previous bullet point that $v_i'$ is an eigenvector for $H$, we find (using the commutation relations) that in fact $Xv_i'$ is a scalar multiple of $F^{k_i}v_i'$ for some integer $k_i$.

  • We thus can apply the argument from David's answer to see that $v_i$ is an eigenvector for $H$ as well.

Answer 4 (score 0):

Here is an answer without invoking Weyl's theorem or even Schur's lemma, although you might spot parts of them in the background.

I write $U(\mathfrak{g})$ for the universal enveloping algebra of $\mathfrak{sl}_2$ to save 2 characters.

  • Let $\mu_1$ be the eigenvalue of $H$ with largest real part. We have that $Ev' = 0$ for every $v' \in \overline{V}(\mu_1)$. Let $V_1 = U(\mathfrak{g})\overline{V}(\mu_1) \subset V$ be the subrepresentation of $V$ generated by $\overline{V}(\mu_1)$. By the reasoning in your original post and in David's answer we see that $H$ acts diagonally on $V_1$, so also on its generalized $\lambda$-eigenspace. Unfortunately this might be a relatively small subspace of the full generalized $\lambda$-eigenspace in $V$, so we move on to the next step.

  • Lemma. Let $V_1$ be as above and let $v \in V_1$ be any element satisfying $Ev = 0$. Then $v \in \overline{V_1}(\mu_1) = \overline{V}(\mu_1)$.

Proof. $v$ is a sum of generalized $H$-eigenvectors, so if we can show that all of those lie in $\overline{V}(\mu_1)$ then so does $v$. Hence it suffices to prove the lemma in the case that $v$ lies in some single $\overline{V}(\mu)$.

Let $\omega = \frac{H^2}{4} + \frac{EF + FE}{2} \in U(\mathfrak{g})$ be the Casimir. We can check, or remember from the literature, that $\omega$ commutes with all elements of $U(\mathfrak{g})$. Rewriting $\omega$ in the form $H^2/4 - H/2 + EF$, and using that $EFv_0 = (FE + H)v_0 = Hv_0$ since $Ev_0 = 0$, we see that $\omega v_0 = \alpha v_0$ for all $v_0 \in \overline{V}(\mu_1)$, where $\alpha$ is some explicitly computable expression in $\mu_1$. (The computation is not hard, but I am scared to make sign errors so I won't do it.) Now since every $v \in V_1$ is of the form $\sum X_iv_i$ with $X_i \in U(\mathfrak{g})$ and $v_i \in \overline{V}(\mu_1)$, it follows that $\omega v = \sum \omega X_i v_i = \sum X_i \omega v_i = \alpha v$ for all $v \in V_1$. So $\omega$ acts diagonally on all of $V_1$.

Now if $v_2 \in V_1(\mu_2)$ is some $H$-eigenvector satisfying $Fv_2 = 0$ the equation $\alpha v_2 = \omega v_2 = (H^2/4 - H/2 + EF)v_2 = (H^2/4 - H/2)v_2$ shows that there are at most two possibilities for $\mu_2$.

Actually doing the computation shows that one of these possibilities is $\mu_1 + 2$, and by definition of $V_1$ we have that $\overline{V_1}(\mu_1 + 2) = \{0\}$. It follows that in reality there is only one possibility for $\mu_2$. (It can be expressed in terms of $\mu_1$, but I will just call it $\mu_2$ because I am afraid of making errors.) The upshot is that any $H$-eigenvector $v_2 \in V_1$ satisfying $Fv_2 = 0$ must live in $\overline{V}(\mu_2)$. This is close to what we want to prove, but with $F$ in the role of $E$.

Now back to the vector $v$ from the lemma, i.e. a vector $v \in V_1$ satisfying $Hv = \mu v$ and $Ev = 0$. Using similar reasoning as above, but exploiting the alternative expression $\omega = H^2/4 + H/2 + FE$ for the Casimir, we conclude that there are at most two possible values of $\mu$. One of them is of course $\mu_1$. We want to show that this is the only possibility, so we have to rule out the other one. Doing the computation we see that the other one equals $\mu_2 - 2$. Now if $\mu = \mu_2 - 2$, then each vector $F^k v$ with $k = 1, 2, 3, \ldots$ is of the form $Fw$ with $w$ an $H$-eigenvector whose eigenvalue has real part smaller than $\Re(\mu_2)$, so that $Fw \neq 0$ by the above. It would follow that $V$ is infinite dimensional, which it is not. Hence $\mu \neq \mu_2 - 2$, and $\mu = \mu_1$ as we wanted to show.
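The two rewritings of $\omega$ and its centrality are easy to sanity-check numerically in the standard $(n+1)$-dimensional irreducible model (illustration only; the proof above does not depend on this model). In this model the scalar $\alpha$ left uncomputed above comes out to $\mu_1(\mu_1 + 2)/4$ with $\mu_1 = n$:

```python
import numpy as np

# Explicit model: the (n+1)-dimensional irreducible representation.
n = 5
d = n + 1
H = np.diag([n - 2.0 * j for j in range(d)])
F = np.diag(np.ones(n), -1)
E = np.diag([(j + 1.0) * (n - j) for j in range(n)], 1)

# Casimir omega = H^2/4 + (EF + FE)/2
omega = H @ H / 4 + (E @ F + F @ E) / 2

# omega commutes with E, F and H (hence with all of U(g))
for X in (E, F, H):
    assert np.allclose(omega @ X - X @ omega, 0)

# The two rewritings used in the lemma, via EF - FE = H:
assert np.allclose(omega, H @ H / 4 - H / 2 + E @ F)
assert np.allclose(omega, H @ H / 4 + H / 2 + F @ E)

# On this irrep omega acts as the scalar n(n+2)/4
# (the alpha of the text, with mu_1 = n in this model)
assert np.allclose(omega, n * (n + 2) / 4 * np.eye(d))
```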

  • Having proven the lemma we move on with the answer.

  • Let $\mu_3$ be the $H$-eigenvalue with largest real part in the quotient representation $V/V_1$. It follows that for every $[v] \in \overline{(V/V_1)}(\mu_2)$ we have that $E[v] = 0 \in V/V_1$. The brackets are added to indicate that elements of $V/V_1$ are equivalence classes of elements of $V$. In particular the statement $E[v] = 0$ means that for every representative $v \in V$ of the class $[v]$ we have that $Ev \in V_1$. Also, again using your earlier result we have that $(H - \mu_3)[v] = 0$, meaning that $(H - \mu_3)v \in V_1$ for all representatives $v$ of $[v]$. Since $Ev \in V_1$ we have that $HEv = (\mu_3 + 2)Ev$ and hence $E(H - \mu_3)v = (EH)v - \mu_3Ev = (HE -2E)v - \mu_3Ev = (H - 2)Ev - \mu_3Ev = \mu_3Ev - \mu_3Ev = 0$

It follows that $(H - \mu_3)v$ satisfies the conditions of the lemma and must hence live in $\overline{V}(\mu_1)$. But it already lived in the space $\overline{V}(\mu_3)$. By construction $\Re(\mu_3) < \Re(\mu_1)$ so $\mu_3 \neq \mu_1$. We conclude that $(H - \mu_3)v = 0$. In other words: every element in $\overline{V}(\mu_3)$ is an $H$-eigenvector.

  • Now it is not hard to see from here that $H$ acts diagonally on the entire subrepresentation $U(\mathfrak{g})(\overline{V}(\mu_3))$ and hence, more interestingly, on the subrepresentation $V_2 = U(\mathfrak{g})(\overline{V}(\mu_3) \oplus \overline{V}(\mu_1))$. You can see where this is going: we take the eigenvalue with largest real part of $V/V_2$, use reasoning like the above to obtain an even bigger subrepresentation $V_3$ on which $H$ acts diagonally, and so on, until we reach a $V_n$ that encompasses all of $V$. The only thing missing is a version of the lemma that works for $V_i$, $i > 1$. We present that here; I trust that you can finish the proof from there.

Lemma: let $n \geq 1$. Let $\alpha_1, \ldots, \alpha_n$ be eigenvalues of the $H$-action on $V$ with $\Re(\alpha_1) > \ldots > \Re(\alpha_n)$ and let $V_n = U(\mathfrak{g})(\bigoplus_{i=1}^n \overline{V}(\alpha_i))$. Let $\beta$ be a complex number with $\Re(\beta) < \Re(\alpha_n)$ and let $v \in V_n$ be a vector satisfying $v \in \overline{V}(\beta)$ and $Ev = 0$. Then $v = 0$.

Proof: We proceed by induction on $n$. The $(n = 1)$-case follows from the previous lemma. For $n > 1$ we consider the subrepresentations $V_{n-1} = U(\mathfrak{g})(\bigoplus_{i=1}^{n-1} \overline{V}(\alpha_i))$ and $W_n = U(\mathfrak{g})(\overline{V}(\alpha_n))$. It is clear from the definition that we can write $v = v_{n-1} + w$ with $v_{n-1} \in V_{n-1}$, $w \in W_n$, and both $H$-eigenvectors with eigenvalue $\beta$, although we will not assume that this decomposition is unique. By definition of $v$ we have that $Ew = -Ev_{n-1}$. Let $\phi: V_n \to V_n/V_{n-1}$ be the quotient map and let $W' = \phi(W_n)$. Then $\phi(w) \in W'$ is an $H$-eigenvector with eigenvalue $\beta$, where $\Re(\beta) < \Re(\alpha_n)$, and from $Ew = -Ev_{n-1} \in V_{n-1}$ we get $E\phi(w) = 0 \in V_n/V_{n-1}$. Now we can apply the first lemma, with $V_n/V_{n-1}$ in the role of $V$ and $W'$ in the role of $V_1$, to conclude that $\phi(w) = 0$. It follows that $w \in V_{n-1}$ and hence that $v \in V_{n-1}$. But then we can apply the induction hypothesis to conclude that $v = 0$.