Simple root systems are contained in a half-space


Let $V$ be a Euclidean space, that is, a finite dimensional real linear space with a symmetric positive definite inner product $\langle \cdot, \cdot\rangle$.

Definition: An (abstract) root system in $V$ is a finite set $\Delta$ of nonzero elements of $V$ such that

(1) $\Delta$ spans $V$;

(2) for all $\alpha \in \Delta$, the reflections $$s_\alpha:\beta \mapsto \beta- \frac{2 \langle \beta, \alpha\rangle}{\langle \alpha, \alpha \rangle} \alpha$$ map the set $\Delta$ to itself;

(3) the number $\frac{2 \langle \beta, \alpha\rangle}{\langle \alpha, \alpha \rangle}$ is an integer for any $\alpha,\beta \in \Delta$.

The elements of $\Delta$ are called roots.
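As a concrete sanity check of the three axioms, here is a minimal sketch using the $A_2$ root system, whose six roots are the vertices of a regular hexagon in $\mathbb{R}^2$ (this specific example is my choice, not from the question):

```python
import numpy as np

# The six roots of A2: unit vectors at angles 0, 60, ..., 300 degrees.
roots = [np.array([np.cos(k * np.pi / 3), np.sin(k * np.pi / 3)]) for k in range(6)]

def reflect(beta, alpha):
    # s_alpha(beta) = beta - 2<beta, alpha>/<alpha, alpha> * alpha
    return beta - 2 * beta.dot(alpha) / alpha.dot(alpha) * alpha

# (1) the roots span R^2
assert np.linalg.matrix_rank(np.array(roots)) == 2

for alpha in roots:
    for beta in roots:
        # (2) each reflection s_alpha maps the set of roots to itself
        image = reflect(beta, alpha)
        assert any(np.allclose(image, gamma) for gamma in roots)
        # (3) the number 2<beta, alpha>/<alpha, alpha> is an integer
        c = 2 * beta.dot(alpha) / alpha.dot(alpha)
        assert abs(c - round(c)) < 1e-9

print("all three axioms hold for A2")
```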

Given a root system $\Delta$ we make another definition.

Definition: A subset $\Pi$ of $\Delta$ is a set of simple roots (a simple root system) in $\Delta$ if

(1) $\Pi$ is a basis for $V$;

(2) Each root $\beta \in \Delta$ can be written as a linear combination of the elements of $\Pi$ with integer coefficients of the same sign, that is, $$\beta=\sum_{\alpha \in \Pi} m_\alpha \alpha $$ with all $m_\alpha \geq 0$ or all $m_\alpha \leq 0$.
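Continuing the hypothetical $A_2$ example, the pair $\Pi = (\alpha_1, \alpha_2)$ below is a simple root system: it is a basis of $\mathbb{R}^2$, and every root decomposes over it with integer coefficients of one sign. A short sketch verifying this (the particular vectors are an assumed example, not from the question):

```python
import numpy as np

# Assumed simple roots for A2: alpha1 at angle 0, alpha2 at angle 120 degrees.
alpha1 = np.array([1.0, 0.0])
alpha2 = np.array([-0.5, np.sqrt(3) / 2])
B = np.column_stack([alpha1, alpha2])       # basis matrix of Pi

# The six roots of A2 (vertices of a regular hexagon).
roots = [np.array([np.cos(k * np.pi / 3), np.sin(k * np.pi / 3)]) for k in range(6)]

for beta in roots:
    m = np.linalg.solve(B, beta)            # coefficients of beta over Pi
    assert np.allclose(m, np.round(m))      # the coefficients are integers ...
    m = np.round(m)
    assert np.all(m >= 0) or np.all(m <= 0) # ... and all of the same sign

print("Pi is a simple root system for A2")
```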

I want to prove that given any root system $\Delta$ and a simple root system $\Pi \subset \Delta$, the set $\Pi$ is contained in some half-space. That is, there exists some $t \in V$ such that $\langle t,\alpha \rangle >0$ for all $\alpha \in \Pi$.

I tried constructing such a $t$ explicitly, as the sum $$t:=\sum_{\alpha \in \Pi} \alpha ,$$ or as the weighted average $$t:= \frac{ \sum_{\alpha \in \Pi} \| \alpha \| \alpha}{\sum_{\alpha \in \Pi} \| \alpha \|}, $$ but I couldn't prove that it has a positive inner product with all elements of $\Pi$. How can one show that such a $t$ exists? Thank you!

Best answer:

If I understand you correctly, the question is not specific to root systems and is just to show that any $n$ linearly independent vectors in $n$-dimensional Euclidean space can be put on one side of a hyperplane.

Rather than try to give a cute explicit construction of $t$, we can just build a system of equations and solve for it.

Let the inner product be given by $\langle x, y \rangle = x^T A y$ for some positive definite symmetric $A$.

If $\Pi = (\alpha_1, \ldots, \alpha_n)$ is a basis for $V \cong \mathbb{R}^n$, let $B$ be the corresponding basis matrix, whose $i$th column is $\alpha_i$.

We want to find a hyperplane through the origin with all of those points strictly on one side, i.e. a vector $t$ such that $\langle\alpha_i, t \rangle = s$ for some fixed $s > 0$ and every $i$.

Letting $\mathbb{1}$ be the all-ones vector, we collect these $n$ equations into a single matrix equation:

$$B^T A t = s\mathbb{1}$$

We can take $s = 1$; any other $s > 0$ merely rescales $t$. Since $B$ and $A$ are invertible, so is $B^T A$, and we get

$$t = {(B^TA)}^{-1} \mathbb{1},$$

which by construction satisfies $\langle \alpha_i, t \rangle = 1 > 0$ for every $i$, as desired.
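The construction above can be sketched numerically. The sketch below assumes the standard inner product ($A = I$) and uses the $A_2$ simple roots as an example basis; these choices are illustrative, not part of the answer:

```python
import numpy as np

A = np.eye(2)                               # assumed inner product: standard dot product
alpha1 = np.array([1.0, 0.0])
alpha2 = np.array([-0.5, np.sqrt(3) / 2])   # example basis: simple roots of A2
B = np.column_stack([alpha1, alpha2])       # basis matrix, i-th column alpha_i

# Solve B^T A t = 1 for t, i.e. t = (B^T A)^{-1} 1 with s = 1.
t = np.linalg.solve(B.T @ A, np.ones(2))

for alpha in (alpha1, alpha2):
    # every basis vector lies strictly in the half-space {x : <x, t> > 0}
    assert alpha @ A @ t > 0

print(t)
```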