Is there a consistent way to get all possible coproducts?


Let's illustrate the problem with an example. Consider the algebra of polynomials in one variable, spanned by $1,x,x^2,\ldots$, with the product $\nabla(x^i \otimes x^j) = x^{i+j}$. Then, reversing arrows in the associativity diagram

$\nabla \circ (\nabla\otimes \mathrm{id}) = \nabla \circ (\mathrm{id} \otimes\nabla)$

one can easily find one coproduct $$\Delta(x^i) = \sum_{j=0}^i x^j\otimes x^{i-j} \quad (1) $$

To be honest, I don't understand how one would prove that there are no other coproducts (are there?). And when I have, say, the quantum group $U_q(\mathfrak{sl}_2)$, the situation looks even scarier.

The only idea I have is that, in the case of polynomials, I should write $\Delta(x^i) = \sum_{jk}\alpha^i_{jk}\,x^j\otimes x^k$ and solve the coassociativity condition $$\sum_{jk}\sum_{rs} \alpha^i_{jk} \alpha^j_{rs}\, x^r\otimes x^s\otimes x^k = \sum_{jk}\sum_{rs} \alpha^i_{jk} \alpha^k_{rs}\, x^j\otimes x^r\otimes x^s,$$ which is equivalent to $\sum_k\alpha^i_{jk}\alpha^k_{rs} = \sum_k\alpha^i_{ks}\alpha^k_{jr}$, which, I guess, has more solutions than just (1).
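As a quick sanity check (my own addition, plain Python, not part of the original question), one can verify by brute force that the coefficients $\alpha^i_{jk} = [j + k = i]$ coming from (1) do satisfy this condition, at least up to a truncation degree:

```python
# Brute-force check that alpha^i_{jk} = [j + k == i], the coefficients of
# coproduct (1), satisfy the coassociativity condition
#   sum_k alpha^i_{jk} alpha^k_{rs} = sum_k alpha^i_{ks} alpha^k_{jr}.

N = 8  # truncation degree; all indices range over 0..N

def alpha(i, j, k):
    """Coefficient of x^j (x) x^k in Delta(x^i) for coproduct (1)."""
    return 1 if j + k == i else 0

def coassociative(i, j, r, s):
    lhs = sum(alpha(i, j, k) * alpha(k, r, s) for k in range(N + 1))
    rhs = sum(alpha(i, k, s) * alpha(k, j, r) for k in range(N + 1))
    return lhs == rhs

assert all(coassociative(i, j, r, s)
           for i in range(N + 1) for j in range(N + 1)
           for r in range(N + 1) for s in range(N + 1))
print("coassociativity condition holds up to degree", N)
```

Of course this only confirms that (1) is *a* solution; it says nothing about uniqueness.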

Best answer

The comultiplication $\Delta : A \to A \otimes A$ in a bialgebra is an algebra homomorphism (with respect to the multiplication), so a comultiplication on the polynomial algebra $k[x]$ is completely determined by $\Delta(x)$; e.g. $\Delta(x^n) = \Delta(x)^n$ (so you can check that the one you wrote down doesn't have this property). Similarly for the counit and the antipode on a Hopf algebra.
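One can see this failure concretely (a sympy sketch of my own, identifying $k[x] \otimes k[x]$ with $k[x, y]$ so that $x^a \otimes x^b$ corresponds to $x^a y^b$):

```python
import sympy as sp

x, y = sp.symbols('x y')

def Delta_question(i):
    """Coproduct (1) from the question: Delta(x^i) = sum_j x^j (x) x^{i-j},
    written as a polynomial in k[x, y] ~ k[x] (x) k[x]."""
    return sum(x**j * y**(i - j) for j in range(i + 1))

# For i = 1 this agrees with Delta(x) = 1 (x) x + x (x) 1 ...
assert sp.expand(Delta_question(1)) == x + y
# ... but Delta(x^2) != Delta(x)^2, so (1) is not an algebra homomorphism:
assert sp.expand(Delta_question(2)) != sp.expand((x + y)**2)
print(sp.expand((x + y)**2 - Delta_question(2)))  # prints x*y
```

The discrepancy is the missing factor of $2$ in the middle term: $\Delta(x)^2 = 1 \otimes x^2 + 2\, x \otimes x + x^2 \otimes 1$.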

There are two standard and important choices of a comultiplication on $k[x]$, "plus"

$$\Delta_a(x) = 1 \otimes x + x \otimes 1$$

(with counit $\varepsilon(x) = 0$) making $x$ primitive, and "times"

$$\Delta_m(x) = x \otimes x$$

(with counit $\varepsilon(x) = 1$) making $x$ grouplike. Only “plus” has an antipode (given by $S(x) = -x$), making it a Hopf algebra, but after inverting $x$, the Laurent polynomial ring $k[x, x^{-1}]$ with “times” has an antipode given by $S(x) = x^{-1}$.

Each of these comultiplications admits two descriptions, a "commutative" one and a "cocommutative" one. The "cocommutative" descriptions are that

  1. $k[x]$ with "plus" is the universal enveloping algebra of the abelian Lie algebra $k$. One way of saying what this means is that $k[x]$ is the free Hopf algebra on a primitive element; that is, morphisms of Hopf algebras out of $k[x]$ into another Hopf algebra $H$ correspond to primitive elements in $H$, which canonically form a Lie algebra under commutator (and taking the universal enveloping algebra is left adjoint to this construction).

  2. $k[x, x^{-1}]$ with "times" is the group algebra of $\mathbb{Z}$. One way of saying what this means is that $k[x, x^{-1}]$ is the free Hopf algebra on a grouplike element; that is, morphisms of Hopf algebras out of $k[x, x^{-1}]$ into another Hopf algebra $H$ correspond to grouplike elements in $H$, which canonically form a group under product (and taking the group algebra is left adjoint to this construction).

The "commutative" descriptions take a little more setup. Concretely they can be thought of as coming from thinking of $k[x] \otimes k[x]$ as the polynomial algebra $k[x, y]$ in two variables, and thinking of a Hopf algebra comultiplication on $k[x]$ as a polynomial $\Delta(x) = f(x, y)$ in two variables with the properties that

  • $f$ is associative: $f(f(x, y), z) = f(x, f(y, z))$;
  • $f$ has a unit: there is some constant $e \in k$ such that $f(e, x) = f(x, e) = x$;
  • $f$ has inverses: there is some polynomial $S(x)$ such that $f(S(x), x) = f(x, S(x)) = e$.

I find this a lot easier to think about than the abstract definition of a coproduct. This says that $f$ is a "polynomial group law," by analogy with formal group laws. Abstractly this says that $f$ defines an affine group scheme structure on the affine line $\text{Spec } k[x] \cong \mathbb{A}^1$ (ignore this if you don't know what it means).
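To make the three axioms concrete, here is a small check (my own addition, assuming the sympy library) that the two comultiplications discussed above, viewed as candidate group laws, satisfy them:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

def check_group_law(f, e, S):
    """Verify associativity, unit, and inverse for a candidate group law
    f(x, y) with proposed unit e and inverse S(x)."""
    lhs = f.subs({x: f, y: z}, simultaneous=True)             # f(f(x, y), z)
    rhs = f.subs(y, f.subs({x: y, y: z}, simultaneous=True))  # f(x, f(y, z))
    assert sp.simplify(lhs - rhs) == 0, "not associative"
    assert sp.simplify(f.subs(x, e) - y) == 0, "e is not a left unit"
    assert sp.simplify(f.subs(y, e) - x) == 0, "e is not a right unit"
    assert sp.simplify(f.subs(y, S) - e) == 0, "S is not an inverse"

check_group_law(x + y, 0, -x)    # "plus": unit 0, antipode S(x) = -x
check_group_law(x * y, 1, 1/x)   # "times": unit 1, needs x inverted for S
print("both group laws satisfy all three axioms")
```

Note that the "times" inverse $1/x$ is a Laurent polynomial, matching the remark that we must pass to $k[x, x^{-1}]$ to get an antipode there.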

In these terms, "plus" is just ordinary addition as a group law $f(x, y) = x + y$, and "times" is just ordinary multiplication as a group law $f(x, y) = xy$ (although again to get inverses / an antipode we need to invert $x$). Your question can be interpreted as asking:

How can we classify all polynomial group laws $f$?

For starters, by translating $x$ as necessary we can assume WLOG that the unit is $e = 0$, which is equivalent to asking that $f(x, y)$ has no constant term and that it begins

$$f(x, y) = x + y + xy (\text{higher order terms}).$$

What we'll try to do from here is to show that there can't be any higher order terms if $f$ is going to satisfy associativity. The idea is that it'll be too hard for the really high order terms of $f(f(x, y), z)$ and $f(x, f(y, z))$ to cancel.

Formally, consider lex order on the monomials in $k[x, y]$ and $k[x, y, z]$: we consider a monomial to be $\ge$ another monomial if the exponent of $x$ is greater, or the exponent of $x$ is equal and the exponent of $y$ is greater, or the exponents of $x$ and $y$ are equal and the exponent of $z$ is greater. For example $x^6 y \ge x^5 y^2 \ge x^5 y$. If you like, you can think of $x$ as infinitely large compared to $y$ which is infinitely large compared to $z$, or imagine that $x$ is growing much faster than $y$ which is growing much faster than $z$.

Suppose $x^n y^m$ is the biggest term in $f(x, y)$, with some coefficient that will not matter (here we'll need to start assuming that $k$ is a reduced ring). Then the biggest term in $f(f(x, y), z)$ is the biggest term in $f(x, y)^n z^m$, which is

$$x^{n^2} y^{nm} z^m.$$

Similarly the biggest term in $f(x, f(y, z))$ is the biggest term in $x^n f(y, z)^m$, which is

$$x^n y^{nm} z^{m^2}.$$
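One can watch this leading-term mismatch happen on a concrete example (a sympy sketch of my own, taking $n = 2$, $m = 1$):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

# A candidate f whose biggest lex term is x^n y^m with n = 2, m = 1.
n, m = 2, 1
f = x + y + x**n * y**m

lhs = f.subs({x: f, y: z}, simultaneous=True)             # f(f(x, y), z)
rhs = f.subs(y, f.subs({x: y, y: z}, simultaneous=True))  # f(x, f(y, z))

# Leading terms in lex order x > y > z, matching the formulas in the text:
assert sp.LT(sp.expand(lhs), x, y, z, order='lex') == x**(n**2) * y**(n*m) * z**m
assert sp.LT(sp.expand(rhs), x, y, z, order='lex') == x**n * y**(n*m) * z**(m**2)
print("leading terms x^4*y^2*z vs x^2*y^2*z disagree, so f is not associative")
```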

So for $f$ to be associative these need to be the same. But this requires that $n^2 = n$ and $m^2 = m$, so $n$ and $m$ must both be equal to either $0$ or $1$. This says that $f$ must have the form

$$f(x, y) = x + y + cxy$$

for some constant $c$. Now it's actually possible for such an $f$ to be associative, say if $c = 1$ (this is just “times” in disguise), but we'll show that it can't have inverses. If $S(x)$ is an inverse polynomial, then in order for $e = 0$ to be the identity we need $S(e) = e$, so $S(0) = 0$, so $S$ also has no constant term. Then

$$f(x, S(x)) = x + S(x) + cx S(x) = 0$$

Writing $S(x) = \sum_{j \ge 1} s_j x^j$ with $d = \deg S$, comparing linear terms on both sides gives $s_1 = -1$, so $S(x) = -x + \text{higher order terms}$; comparing the coefficient of $x^j$ for $2 \le j \le d$ gives $s_j = -c\,s_{j-1}$, hence $s_j = \pm c^{j-1}$, and comparing the top coefficient, that of $x^{d+1}$, gives $c\,s_d = 0$, i.e. $c^d = 0$, so $c = 0$ since $k$ is reduced. So in fact, up to translating the unit around, the additive group law

$$f(x, y) = x + y$$

is the unique polynomial group law, and so, up to translation, is the unique comultiplication on $k[x]$ making it a Hopf algebra.
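One can let a computer algebra system do this coefficient chase (a sympy sketch of my own; the ansatz $S(x) = a_1 x + a_2 x^2$ is truncated at degree $2$ purely for illustration):

```python
import sympy as sp

x, c, a1, a2 = sp.symbols('x c a_1 a_2')

# Generic inverse with no constant term (as argued above), truncated at
# degree 2, for the candidate group law f(x, y) = x + y + c*x*y.
S = a1 * x + a2 * x**2
eq = sp.expand(x + S + c * x * S)            # f(x, S(x)), which must be 0
coeffs = sp.Poly(eq, x).all_coeffs()         # each coefficient must vanish
solutions = sp.solve(coeffs, [a1, a2, c], dict=True)
print(solutions)  # the system forces c = 0 and S(x) = -x
```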

This straightforward argument happened to work because $k[x]$ is a very easy ring to understand; in general there are many interesting affine group schemes, reflecting the existence of interesting comultiplications on many commutative algebras. For example there is an affine group scheme $GL_n$ which as a Hopf algebra has underlying algebra

$$k[x_{ij}, 1 \le i, j \le n][\det(x_{ij})^{-1}]$$

with comultiplication coming from writing out matrix multiplication in coordinates. Deforming group schemes like these gives certain kinds of quantum groups.
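As an illustration (my own sketch, assuming sympy), the comultiplication here is $\Delta(x_{ij}) = \sum_k x_{ik} \otimes x_{kj}$, and its coassociativity is exactly associativity of matrix multiplication, with the three tensor legs played by three independent matrices of variables:

```python
import sympy as sp

n = 2
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'a{i}{j}'))
B = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'b{i}{j}'))
C = sp.Matrix(n, n, lambda i, j: sp.Symbol(f'c{i}{j}'))

# Coassociativity of Delta(x_ij) = sum_k x_ik (x) x_kj is associativity
# of matrix multiplication on the three tensor legs:
assert ((A * B) * C - A * (B * C)).expand() == sp.zeros(n, n)

# The counit eps(x_ij) = delta_ij works because I*A = A*I = A:
assert sp.eye(n) * A == A and A * sp.eye(n) == A
print("Delta for GL_2 is coassociative, with counit eps(x_ij) = delta_ij")
```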