Existence of integrals in finite-dimensional Hopf algebras


In *The Haar measure on finite quantum groups*, van Daele gives an implausibly short proof of the existence of integrals in a finite-dimensional Hopf algebra.

I'm probably overlooking something obvious, but this proof feels very esoteric to me: I cannot draw it in pictures.

So:

Let $H$ be a finite-dimensional Hopf algebra. Then there exists a non-zero $t\in H$ such that $ht=\epsilon(h)t$ for all $h\in H$.

Proof. Let $\{e_i\}$ be a basis of $H$ with corresponding dual basis $\{e^i\}$. Take any non-zero $b\in H$, and set $$ t = \sum_{i,(e_i)}\langle e^i, S^2(e_i'') b\rangle e_i'\ , $$ where $\langle f,h\rangle = f(h)$ is the pairing. Then, for any $h\in H$ we have \begin{align} \epsilon(h)t &~=~\sum_{i,(e_i),(h)}\langle e^i, h'S(h'')S^2(e_i'') b\rangle~ e_i'\\[1em] \tag{$\star$}&~=~\sum_{j,i,(e_i),(h)}\langle e^i, h' e_j\rangle~\langle e^j,S(h'')S^2(e_i'') b\rangle~ e_i'\\[1em] \tag{$\maltese$}&~=~\sum_{i,(e_i),(h)} \langle e^i, S(h''')S^2(h''e_i'') b \rangle~ h'e_i' \\[1em] \tag{$\spadesuit$}&~=~h \sum_{i,(e_i)} \langle e^i, S^2(e_i'')b\rangle ~e_i' \\ &~=~ ht \end{align}
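As a sanity check, van Daele's formula is easy to compute directly in the simplest example, the group algebra $k[\mathbb{Z}/3]$, where $\Delta(g)=g\otimes g$, $S(g)=g^{-1}$, $\epsilon(g)=1$ and $S^2=\operatorname{id}$, so all Sweedler sums collapse. A minimal Python sketch (dict-based group algebra; all helper names are ad hoc):

```python
# Sanity check of van Daele's formula in the group algebra k[Z/3], where
# Delta(g) = g (x) g, S(g) = -g, eps(g) = 1, and S^2 = id, so the Sweedler
# sums collapse to a single term per basis element.
n = 3
G = list(range(n))  # Z/3 written additively

def mul(x, y):
    """Product in the group algebra: convolve coefficient dicts over Z/n."""
    out = {g: 0 for g in G}
    for g, a in x.items():
        for k, c in y.items():
            out[(g + k) % n] += a * c
    return out

def eps(x):
    """Counit: sum of coefficients."""
    return sum(x.values())

def t_of(b):
    """t = sum_g <e^g, S^2(g) b> g.  Here g' = g'' = g and S^2 = id, so the
    coefficient of g in t is the coefficient of g in g*b."""
    return {g: mul({g: 1}, b)[g] for g in G}

b = {0: 1, 1: 2}       # any b with nonzero coefficient at the identity 0
t = t_of(b)            # this comes out as the sum of all group elements

# Left-integral property: h * t = eps(h) * t for every group basis element h.
for h in G:
    assert mul({h: 1}, t) == t      # eps of a group element is 1
```

Here $t$ comes out as $\sum_g g$, the familiar integral in a group algebra.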

The first equality is obvious (and of course so is the last). Next, $\star$ is probably something like inserting $1$ or $\operatorname{id}_H$ (which is given by $\operatorname{coev}_H(1)$), but I'm not sure.

But I don't even have a guess for $\maltese$ and $\spadesuit$.

For example, in $\maltese$, why is $h'$ suddenly not paired anymore, and where does the second antipode come from that acts on $h''$?

And $\spadesuit$ at least superficially seems to be using the definition of the antipode, but I don't think it actually does.

Please end my suffering.


The following is a failed attempt, though the computation itself is valid: assume for a second that the pairing is actually symmetric. Then \begin{align} &\sum_{i,(e_i),(h)} \langle e^i, S(h''')S^2(h''e_i'') b \rangle~ h'e_i' \\[1em] ~=~&\sum_{i,(e_i),(h)} \langle e^i, S(h''')S^2(e_i'')S^2(h'') b \rangle~ h'e_i' \\[1em] ~=~&\sum_{i,(e_i),(h)} \langle e^i, S(h'''S(h''))S^2(e_i'') b \rangle~ h'e_i' \\[1em] ~=~&\sum_{i,(e_i),(h)} \langle e^i, S^2(e_i'') b \rangle~ \epsilon({h''}) h'e_i' \\[1em] ~=~&h\sum_{i,(e_i)} \langle e^i, S^2(e_i'') b \rangle~ e_i' \ . \end{align} But I think the pairing is only symmetric if $H$ is commutative. So this is superfluous. Meh.


Nope. The solution to $\spadesuit$ is just \begin{align} \sum S(h'')S^2(h' a) &= \sum S\big(S(h'a)h''\big) = \sum S\big(S(a)S(h')h''\big) = \epsilon(h)S^2(a)\ , \end{align} isn't it?
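The identity $\sum S(h'')S^2(h'a)=\epsilon(h)S^2(a)$ (note the $S^2$ on the right) can also be checked numerically in a noncommutative, if cocommutative, example: the group algebra $k[S_3]$, where $\Delta(g)=g\otimes g$ and $S^2=\operatorname{id}$. A sketch with ad-hoc helper names:

```python
# Check sum S(h'') S^2(h' a) = eps(h) S^2(a) in the group algebra k[S3]
# (noncommutative but cocommutative: Delta(g) = g (x) g and S^2 = id, so the
# Sweedler sum has one term per group-algebra basis element appearing in h).
from itertools import permutations

G = list(permutations(range(3)))           # S3 as permutation tuples

def comp(p, q):                            # composition: (p*q)(i) = p(q(i))
    return tuple(p[q[i]] for i in range(3))

def inv(p):                                # inverse permutation
    out = [0] * 3
    for i, pi in enumerate(p):
        out[pi] = i
    return tuple(out)

def mulA(x, y):                            # product in the group algebra
    out = {}
    for g, a in x.items():
        for k, c in y.items():
            out[comp(g, k)] = out.get(comp(g, k), 0) + a * c
    return {g: v for g, v in out.items() if v != 0}

def scale(c, x):
    return {g: c * a for g, a in x.items() if c * a != 0}

def eps(x):                                # counit: sum of coefficients
    return sum(x.values())

def lhs(h, a):
    """sum S(h'') S^2(h' a) for h = sum alpha_g g: each group-like g
    contributes alpha_g * g^{-1} * (g * a), since S(g) = g^{-1}, S^2 = id."""
    out = {}
    for g, alpha in h.items():
        term = scale(alpha, mulA({inv(g): 1}, mulA({g: 1}, a)))
        for k, v in term.items():
            out[k] = out.get(k, 0) + v
    return {g: v for g, v in out.items() if v != 0}

h = {G[1]: 2, G[4]: -1}                    # an arbitrary test element, eps = 1
a = {G[0]: 1, G[3]: 5}
assert lhs(h, a) == scale(eps(h), a)       # = eps(h) S^2(a) = eps(h) a here
```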


There are 3 answers below.


This doesn't answer the question, I think, but it gives an alternative proof which convinces me of the truth of the statement. (I'm still interested in understanding the original proof)

The argument can be seen from the following picture:

(pictorial proof: string-diagram image, not reproduced here)

This is for the case $b=1$, as user m_t_ suggested. If $b$ is any other non-zero element, the proof is completely analogous: $b$ is attached on the right, above the two antipodes, so it never gets in the way.

The steps are:

  • coevaluation is an intertwiner
  • comultiplication is an intertwiner
  • antipode is an antihomomorphism, use twice
  • the (left) action on the dual space is $\langle a, h\rightharpoonup f \rangle \equiv \langle S(h)\rightharpoonup a, f\rangle$
  • antipode axiom

(Some of this is copied from the question, just to have the full framework in one place.)

Recall: Let $\{e_i\}$ be a basis of $H$ with corresponding dual basis $\{e^i\}$. In particular, the identity $\operatorname{id}\in\operatorname{Hom}(H,H)\cong H^*\otimes H$ corresponds to $\sum e^i\otimes e_i$, so we can write any element of $H$ in the form $$ h = \operatorname{id}(h) =\sum e^i(h)\;e_i =\sum \langle e^i,h\rangle\;e_i \ . $$ Also, with appropriate notation: $$ \begin{aligned} \langle e^i, hk\rangle &= \langle e^i, h\;\operatorname{id}(k)\rangle \\ &= \left\langle e^i, h\; \sum\langle e^j, k\rangle e_j \right\rangle \\ &= \sum \langle e^i, h e_j\rangle\; \langle e^j, k\rangle \qquad (*) \ . \end{aligned} $$ Also, for any linear $\phi:H\to H$ we have $$ \phi(h) = \phi\Big(\sum \langle e^i,h\rangle\;e_i\Big) =\sum \langle e^i,h\rangle\;\phi(e_i) \ . \qquad(\diamond) $$
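The dual-basis facts $(*)$ and $(\diamond)$ are pure linear algebra and easy to check numerically; a small illustration with numpy, using a randomly chosen basis of $\mathbb{R}^3$ (names and setup are mine, not from the answer):

```python
# Numeric illustration of the dual-basis expansion and of (*): nothing
# Hopf-theoretic, just linear algebra in R^3 with an arbitrary basis.
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))        # columns e_1, e_2, e_3: a generic basis
Bdual = np.linalg.inv(B)               # rows are the dual functionals e^1, e^2, e^3

h = rng.standard_normal(3)

# <e^i, h> = Bdual[i] @ h, and h = sum_i <e^i, h> e_i:
coeffs = Bdual @ h
assert np.allclose(B @ coeffs, h)

# Inserting id via the dual basis, as in (*): for any functional f,
# f(h) = sum_j f(e_j) <e^j, h>.
f = rng.standard_normal(3)             # a functional, as a row vector
assert np.allclose(f @ h, sum((f @ B[:, j]) * coeffs[j] for j in range(3)))
```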

Take now any non-zero $b\in H$, and set $$ t = t(b) = \sum_{i,(e_i)}\langle e^i, S^2(e_i'') b\rangle e_i'\ , $$

Now we start the...

Long computation: The sums carry the appropriate (in part conventional, Hopf-algebraic) summation indices. For any $h\in H$ we then have: $$ \begin{aligned} \epsilon(h)\;t(b) &= \epsilon(h)\sum\langle e^i, S^2(e_i'') b\rangle e_i' \\ &= \sum\langle e^i, \epsilon(h)\; S^2(e_i'') b\rangle e_i' \\ &= \sum\langle e^i, (\operatorname{id}*S)(h)\; S^2(e_i'') b\rangle e_i' \\ & = \sum\langle e^i, h'S(h'')\; S^2(e_i'') b\rangle\; e_i' \\ & = \sum\langle e^i, h'\;\operatorname{id}(S(h'')\; S^2(e_i'') b)\rangle\; e_i' \\ &\qquad\text{as in $(*)$} \\ & = \sum\langle e^i, h'e_j\rangle\;\underbrace{\langle e^j,\ S(h'')\; S^2(e_i'') b\rangle\; e_i'}_{\phi_j(e_i)} \\ &\qquad\text{as in $(\diamond)$, with the linear map $\phi_j$ (also depending on $h''$) given by} \\ &\qquad\text{$\phi_j(x)=\sum\langle e^j, \ S(h'')\; S^2(x'') b\rangle\; x'$} \\ &=\sum \phi_j(h'e_j) \\ &=\sum \langle e^j,\ S(h'')\; S^2((h'e_j)'') b\rangle\; (h'e_j)' \\ &\qquad\text{now redistribute the parts: $\Delta$ is an algebra morphism, so} \\ &\qquad\text{$(h'e_j)'\otimes(h'e_j)''=h'e_j'\otimes h''e_j''$, and re-indexing $h$ gives} \\ &=\sum \langle e^j,\ S(h''')\; S^2(h''e_j'') b\rangle\; h'e_j' \\ &=\sum \langle e^j,\ S(h''')\; S^2(h'')S^2(e_j'') b\rangle\; h'e_j' \\ &\qquad\text{$S$ is an antimorphism, so $S^2$ is a morphism, as in your failed attempt} \\ &\qquad\text{(just corrected at the one place!)} \\ &=\sum \langle e^j,\ S(S(h'')h''')S^2(e_j'') b\rangle\; h'e_j' \\ &=\sum \langle e^j,\ S((S*\operatorname{id})h'')S^2(e_j'') b\rangle\; h'e_j' \\ &=\sum \langle e^j,\ S(\epsilon(h'')1)S^2(e_j'') b\rangle\; h'e_j' \\ &=\sum \langle e^j,\ \epsilon(h'')S^2(e_j'') b\rangle\; h'e_j' \\ &=\sum \langle e^j,\ S^2(e_j'') b\rangle\; \epsilon(h'')h'e_j' \\ &=\sum \langle e^j,\ S^2(e_j'') b\rangle\; he_j' \\ &=h\sum \langle e^j,\ S^2(e_j'') b\rangle\; e_j' \\ &= h\; t(b)\ . \end{aligned} $$ (There is only one place where we have to draw diagrams and rearrange, using the fact that $\Delta$ is multiplicative: the step where the parts are redistributed.
The notation is also fuzzy for my taste. But I hope I could give a hint of what is going on, seen from my perspective. It is a pity that in this era of electronic publications authors do not provide full proofs, with full details. It took me some days to exhibit the arguments... Sorry, have to submit, back to work.)


Since it seems that you have more or less figured out the solution, an answer is no longer needed, but please let me take van Daele's part by showing why things are easier than you think. Let $H$ be finite-dimensional with dual bases $\{e_i\}$ and $\{e^i\}$. I will adopt Sweedler notation in the form $$\Delta(x) = \sum x'\otimes x''$$ as you did. By definition, $t=\sum e^i\left(S^2(e_i'')b\right)e_i'$. Notice the following facts \begin{equation}\label{eq:1}\tag{$\dagger$} \sum e^i(h)e_i'\otimes e_i'' = \sum h'\otimes h'' \end{equation} for all $h\in H$ since $h=\sum e^i(h)e_i$, and \begin{equation}\label{eq:2}\tag{$\ddagger$} \sum S(h'')S^2(h'k) = \sum S(h'')S\left(S(h'k)\right) = \sum S\left(S(k)S(h')h''\right) = S^2(k)\varepsilon(h) \end{equation} for all $h,k\in H$. Now \begin{align*} \varepsilon(h)t & = \sum e^i\left(\varepsilon(h)S^2(e_i'')b\right)e_i' \\ & = \sum e^i\left(h'S(h'')S^2(e_i'')b\right)e_i' \\ & = \sum e^i\left(h'e_je^j\left(S(h'')S^2(e_i'')b\right)\right)e_i' \\ & = \sum e^i\left(h'e_j\right)e^j\left(S(h'')S^2(e_i'')b\right)e_i' \\ & \stackrel{\eqref{eq:1}}{=} \sum e^j\left(S(h''')S^2(h''e_j'')b\right)h'e_j' \\ & \stackrel{\eqref{eq:2}}{=} \sum e^j\left(S^2(e_j'')b\right)h'\varepsilon(h'')e_j' \\ & = h\left(\sum e^j\left(S^2(e_j'')b\right)e_j'\right) = ht. \end{align*} Thus, van Daele's proof is indeed correct and indeed short, once you have a bit of practice with Sweedler notation and finite-dimensional computations.
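For what it's worth, the whole chain can be run end-to-end in a noncommutative example, the group algebra $k[S_3]$, where $\Delta(g)=g\otimes g$, $S(g)=g^{-1}$ and $S^2=\operatorname{id}$, so the formula for $t$ collapses to one term per basis element. A Python sketch (ad-hoc names, not from the answer):

```python
# End-to-end check: compute t via the formula and verify eps(h) t = h t in the
# group algebra k[S3] (cocommutative, with S^2 = id).
from itertools import permutations

G = list(permutations(range(3)))        # S3 as permutation tuples

def comp(p, q):                         # composition: (p*q)(i) = p(q(i))
    return tuple(p[q[i]] for i in range(3))

def mulA(x, y):                         # product in the group algebra
    out = {}
    for g, a in x.items():
        for k, c in y.items():
            out[comp(g, k)] = out.get(comp(g, k), 0) + a * c
    return {g: v for g, v in out.items() if v != 0}

def eps(x):                             # counit: sum of coefficients
    return sum(x.values())

e = tuple(range(3))                     # identity permutation

def t_of(b):
    """t = sum_g <e^g, S^2(g) b> g: here g' = g'' = g and S^2 = id, so the
    coefficient of g in t is the coefficient of g in g*b."""
    return {g: mulA({g: 1}, b).get(g, 0) for g in G}

b = {e: 1, G[2]: 3}                     # nonzero coefficient at the identity
t = t_of(b)
assert t == {g: 1 for g in G}           # t = sum of all group elements

h = {G[1]: 2, G[5]: -7}                 # an arbitrary h: check eps(h) t = h t
ht = mulA(h, t)
assert ht == {g: eps(h) * v for g, v in t.items()}
```

Note that in a group algebra only the coefficient of the identity in $b$ survives into $t$, which is why some choice of non-zero $b$ is needed to get $t\neq 0$.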