Introduction to Root Systems


In order to understand Lie algebras and the Weyl group, I am learning about root systems, and I am looking for an intuitive explanation of some parts. From here:

A subset $R$ of a vector space $V$ is called a root system in $V$ if the following conditions are satisfied:

  • $R$ is finite, spans $V$ and does not contain $0$
  • For each $\alpha \in R$, there exists a symmetry $s_\alpha$ with vector $\alpha$ leaving $R$ invariant.
  • For each $\alpha$ and $\beta$ in $R$, the vector $s_\alpha(\beta) - \beta$ is an integer multiple of $\alpha$ (i.e. $\langle\alpha^\vee,\beta\rangle \in \mathbb{Z}$).

A root system is called reduced if, for each $\alpha \in R$, the intersection of $R$ with $\mathbb{R}\alpha$ is the set $\{\alpha, -\alpha\}$.
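To make these axioms concrete, here is a small numerical check of my own (not part of the quoted definition) that the familiar $A_2$ configuration, six unit vectors at $60^\circ$ angles in the plane, satisfies all three conditions:

```python
import math

# The A2 root system: six unit vectors at 60-degree angles in the plane.
roots = [(math.cos(k * math.pi / 3), math.sin(k * math.pi / 3)) for k in range(6)]

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

def reflect(alpha, beta):
    # s_alpha(beta) = beta - <alpha_vee, beta> * alpha,
    # where <alpha_vee, beta> = 2 (beta, alpha) / (alpha, alpha)
    c = 2 * dot(beta, alpha) / dot(alpha, alpha)
    return (beta[0] - c * alpha[0], beta[1] - c * alpha[1])

def close(u, v, tol=1e-9):
    return abs(u[0] - v[0]) < tol and abs(u[1] - v[1]) < tol

for a in roots:
    for b in roots:
        # each s_alpha maps R into R (invariance of R)...
        assert any(close(reflect(a, b), r) for r in roots)
        # ...and the pairing <alpha_vee, beta> is an integer (integrality)
        pairing = 2 * dot(b, a) / dot(a, a)
        assert abs(pairing - round(pairing)) < 1e-9
print("A2 satisfies the axioms")
```

The set is also reduced: for each root, the only scalar multiples of it in the set are itself and its negative.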

This leads to the main questions/clarifications:

  1. What does it mean that $R$ "does not contain $0$", and why is that significant?
  2. Every vector in $R$ has a symmetry which leaves $R$ invariant (it remains unchanged under the transformation). I am not sure what it means for "$R$" to be invariant.
  3. The condition $\langle\alpha^\vee,\beta\rangle \in \mathbb{Z}$ says something about integrality (Wikipedia). What is its significance?
  4. Every vector in $R$ is called a root. Why is it called that?
  5. What is the significance of a root system?

From what I've seen, the span of a set of vectors in a vector space is the intersection of all subspaces containing that set. This definition is a bit confusing. Say $X$ is the collection of all subspaces containing our set of vectors $Y$, and $\bigcap_i X_i$ is their intersection, so $\operatorname{span}(Y) = \bigcap_i X_i$. But we already have $Y$, the set of vectors, so it seems like a cyclic definition. The other definition, where the span of $Y$ is the set of all finite linear combinations of elements of $Y$, makes a bit more sense:

$$\operatorname{span}(S)=\left\{\left.\sum_{i=1}^{k}\lambda_{i}v_{i}\,\right|\,k\in\mathbb{N},\ v_{i}\in S,\ \lambda_{i}\in\mathbf{K}\right\}.$$
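For instance (an illustrative sketch of mine, not part of the quoted definition), taking $S = \{v_1, v_2\} \subset \mathbb{R}^3$, an element of $\operatorname{span}(S)$ is just a finite linear combination $\lambda_1 v_1 + \lambda_2 v_2$:

```python
# Two fixed vectors in R^3 (an illustrative choice).
v1, v2 = (1, 0, 1), (0, 1, 1)

def combination(l1, l2):
    # the linear combination l1*v1 + l2*v2, computed coordinatewise
    return tuple(l1 * a + l2 * b for a, b in zip(v1, v2))

print(combination(2, -1))  # → (2, -1, 1)
```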

Best Answer
  1. $0$ would not contribute anything, so let's agree not to include it. Having it would spoil anything we want to say about possible ratios of root lengths. And of course, there is no such thing as a reflection $s_0$ in the hyperplane orthogonal to $0$.

  2. $s_\alpha$ leaves $R$ invariant just means that $s_\alpha(\beta)\in R$ for every $\beta \in R$; in other words, $s_\alpha$ permutes the elements of $R$.

  3. This condition makes root systems rather "rigid": it severely restricts the possible angles and length ratios between roots.

  4. I suppose one should look this up in Killing's original works (where they are called Wurzelsysteme, German for "root systems").

  5. Root systems can be used to classify (semisimple) Lie algebras (and some other things). It is quite remarkable in itself that these few rules boil down the possibilities to a few infinite series and a few exceptions.
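A sketch of my own expanding on point 3: applying the integrality condition to both $\langle\alpha^\vee,\beta\rangle$ and $\langle\beta^\vee,\alpha\rangle$ gives $\langle\alpha^\vee,\beta\rangle\langle\beta^\vee,\alpha\rangle = 4\cos^2\theta$, an integer, so for non-proportional roots it can only be $n \in \{0,1,2,3\}$ and only four angles (up to supplement) can occur between roots:

```python
import math

# 4*cos(theta)**2 must be an integer n in {0, 1, 2, 3}, so
# cos(theta) = sqrt(n)/2 and only four angles (up to supplement) occur.
angles = [round(math.degrees(math.acos(math.sqrt(n) / 2)), 1) for n in range(4)]
print(angles)  # → [90.0, 60.0, 45.0, 30.0]
```

This is the "rigidity" at work: the axioms leave almost no freedom, which is what makes a complete classification possible.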

The two definitions of span are equivalent, by the way: any subspace of the surrounding space that contains all vectors in $Y$ must also contain all their finite linear combinations, so the intersection of all subspaces containing $Y$ contains at least these linear combinations. On the other hand, the set of all finite linear combinations of elements of $Y$ is itself a subspace containing $Y$, hence is among the spaces being intersected.
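This equivalence can even be verified mechanically in a setting where all subspaces are enumerable; here is a brute-force sketch of mine over the two-element field $\mathbb{F}_2$ (the particular choice of $Y$ is arbitrary):

```python
from itertools import combinations, product

vectors = list(product((0, 1), repeat=3))  # all 8 vectors of F_2^3

def add(u, v):
    return tuple((a + b) % 2 for a, b in zip(u, v))

def scale(l, v):
    return tuple((l * a) % 2 for a in v)

def is_subspace(W):
    # over F_2, a set containing 0 and closed under addition is a subspace
    return (0, 0, 0) in W and all(add(u, v) in W for u in W for v in W)

v1, v2 = (1, 0, 0), (0, 1, 1)
Y = {v1, v2}

# Definition 1: intersect every subspace of F_2^3 that contains Y.
containing = [set(W) for r in range(1, 9) for W in combinations(vectors, r)
              if Y <= set(W) and is_subspace(set(W))]
span1 = set.intersection(*containing)

# Definition 2: collect all linear combinations l1*v1 + l2*v2.
span2 = {add(scale(l1, v1), scale(l2, v2)) for l1 in (0, 1) for l2 in (0, 1)}

print(span1 == span2)  # → True
```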