The convex hull of the vertices of a hypercube contains the hypercube


The title says it all: I want to prove that the convex hull of the vertices of a hypercube contains the hypercube itself. Geometrically it is pretty obvious, but I can't come up with a rigorous proof.

I tried induction on the dimension $n$ of $\mathbb R^n$, but this doesn't seem to work nicely. A direct approach might be better, I think, but for dimensions higher than $3$ I find it really confusing to construct an explicit convex combination.

2 Answers

BEST ANSWER

Induction on dimension works just fine. The point $(x_1, \ldots, x_n) \in [0,1]^n$ is a convex combination of the points $(x_1, \ldots, x_{n-1}, 0)$ and $(x_1, \ldots, x_{n-1}, 1)$ with weights $1-x_n$ and $x_n$. These two points lie on the bottom and top faces of the hypercube and therefore, by induction, in the convex hulls of the bottom and top vertices respectively.

The resulting convex combination is explicitly given by

$$x = \sum_{p \in \{0,1\}^n}\left(\prod_{i=1}^n(1-|x_i-p_i|)\right)p~.$$
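This explicit formula can be checked numerically. The sketch below (plain Python; the helper name `hull_weights` is just illustrative) verifies on a random point of $[0,1]^4$ that the weights are nonnegative, sum to $1$, and reproduce $x$:

```python
import itertools
import random

def hull_weights(x):
    """Weight of each vertex p in {0,1}^n: w_p = prod_i (1 - |x_i - p_i|)."""
    n = len(x)
    weights = {}
    for p in itertools.product((0, 1), repeat=n):
        w = 1.0
        for xi, pi in zip(x, p):
            w *= 1 - abs(xi - pi)
        weights[p] = w
    return weights

random.seed(0)
x = [random.random() for _ in range(4)]
w = hull_weights(x)

assert all(wp >= 0 for wp in w.values())          # nonnegative coefficients
assert abs(sum(w.values()) - 1) < 1e-12           # coefficients sum to 1
for i in range(4):                                # the combination reproduces x
    coord = sum(wp * p[i] for p, wp in w.items())
    assert abs(coord - x[i]) < 1e-12
```

The weights are exactly the multilinear-interpolation coefficients: summing $\prod_i(1-|x_i-p_i|)$ over all $p \in \{0,1\}^n$ factors as $\prod_i\big((1-x_i)+x_i\big) = 1$.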


You can show that a product of convex hulls equals the convex hull of the product: $$\prod_{i=1}^n \operatorname{co}(A_i) = \operatorname{co}\left(\prod_{i=1}^n A_i\right)$$

Prove it for a product of two sets, then use induction. Let us show that $$\operatorname{co}(A)\times \operatorname{co}(B) = \operatorname{co}(A\times B)$$

Clearly, $\operatorname{co}(A)\times \operatorname{co}(B)$ is convex and contains $A\times B$, so it contains $\operatorname{co}(A\times B)$. Only the inclusion $$\operatorname{co}(A)\times \operatorname{co}(B) \subset \operatorname{co}(A\times B)$$ remains to be proved. Let $x= \sum_{a\in A} \lambda_a\, a \in \operatorname{co}(A)$ and $y = \sum_{b\in B} \mu_b\, b \in \operatorname{co}(B)$, with $\lambda_a, \mu_b \ge 0$ and $\sum_a \lambda_a = \sum_b \mu_b = 1$. One checks that $$(x,y) = \sum_{(a,b)\in A\times B} \lambda_a \mu_b\, (a,b)~,$$ and the coefficients $\lambda_a\mu_b$ are nonnegative with $\sum_{(a,b)} \lambda_a\mu_b = \big(\sum_a \lambda_a\big)\big(\sum_b \mu_b\big) = 1$, so $(x,y) \in \operatorname{co}(A\times B)$.
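As a quick sanity check, here is a minimal Python sketch (the finite sets $A$, $B \subset \mathbb R$ and the random weights are arbitrary choices for illustration) verifying that the product weights $\lambda_a\mu_b$ are again convex coefficients and reproduce $(x,y)$:

```python
import random
random.seed(1)

A = [(0.0,), (1.0,), (2.0,)]   # finite subsets of R, as 1-tuples
B = [(0.0,), (3.0,)]

def convex_weights(k):
    """Random nonnegative weights summing to 1."""
    w = [random.random() for _ in range(k)]
    s = sum(w)
    return [wi / s for wi in w]

lam = convex_weights(len(A))
mu = convex_weights(len(B))

x = sum(l * a[0] for l, a in zip(lam, A))   # point of co(A)
y = sum(m * b[0] for m, b in zip(mu, B))    # point of co(B)

# Product weights lambda_a * mu_b over A x B.
prod_w = [(l * m, a, b) for l, a in zip(lam, A) for m, b in zip(mu, B)]

assert abs(sum(w for w, _, _ in prod_w) - 1) < 1e-12          # sum to 1
assert abs(sum(w * a[0] for w, a, _ in prod_w) - x) < 1e-12   # first coordinate
assert abs(sum(w * b[0] for w, _, b in prod_w) - y) < 1e-12   # second coordinate
```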

$\bf{Added:}$ The above solution provides a writing involving $2^n$ terms, but a writing with only $n+1$ terms is possible. Let $x=(x_1, \ldots, x_n)$ be a point of the unit $n$-dimensional cube and assume, after permuting coordinates, that $$x_1\le x_2 \le \cdots \le x_n$$ Set $\lambda_1 = x_1$ and $\lambda_i = x_i - x_{i-1}$ for $i = 2, \ldots, n$, so that $\lambda_i \ge 0$ and $x_k = \lambda_1 + \cdots + \lambda_k$. Then we have $$(x_1, \ldots, x_n) = (\lambda_1,\ \lambda_1 + \lambda_2,\ \ldots,\ \lambda_1 + \cdots + \lambda_n) = \\ = \lambda_1 (1,\ldots, 1) + \lambda_2 (0,1,\ldots, 1) + \cdots + \lambda_n(0,\ldots, 0,1) + (1-x_n)(0,0,\ldots,0)$$ where the coefficients are nonnegative and sum to $\lambda_1+\cdots+\lambda_n + (1-x_n) = 1$.
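The $(n+1)$-term decomposition can likewise be checked numerically. A Python sketch (the names `lam`, `verts`, `coeffs` are just illustrative) for a random sorted point in $[0,1]^5$:

```python
import random
random.seed(2)

n = 5
x = sorted(random.random() for _ in range(n))   # WLOG x_1 <= ... <= x_n

# lambda_1 = x_1, lambda_i = x_i - x_{i-1}; nonnegative by the ordering.
lam = [x[0]] + [x[i] - x[i - 1] for i in range(1, n)]
assert all(l >= 0 for l in lam)

# Vertices (1,...,1), (0,1,...,1), ..., (0,...,0,1), plus the origin.
verts = [[0] * k + [1] * (n - k) for k in range(n)] + [[0] * n]
coeffs = lam + [1 - sum(lam)]                   # last coefficient is 1 - x_n
assert abs(sum(coeffs) - 1) < 1e-12 and coeffs[-1] >= 0

# The n+1 term combination reproduces x.
recon = [sum(c * v[i] for c, v in zip(coeffs, verts)) for i in range(n)]
assert all(abs(r - xi) < 1e-12 for r, xi in zip(recon, x))
```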