Consider the cube $\Delta \subset \mathbb{R}^n$ defined by the inequalities $-1 \le x_1 \le 1, \dots, -1 \le x_n \le 1$. Let $F : \mathbb{R}^n \to \mathbb{R}$ be the linear function $F(x_1, \dots, x_n) = -x_1 + x_2 - \cdots + (-1)^n x_n$. Find all vertices of the cube at which the index of $F$ equals $0$, and all vertices at which it equals $n$.
We have to solve this using convex geometry. My understanding is that the index of $F$ at a vertex is the number of edges at that vertex along which $F$ decreases (edges "looking down" relative to $F$), so we have to find the vertices where all edges go down and the vertices where none do.
If you want to think about this geometrically, the vector $(-1, 1, -1, \cdots, (-1)^n)$ is normal to the level hyperplanes of $F$ and points at one of the vertices. That vertex maximizes $F$, with $F(-1, 1, \cdots, (-1)^n) = n$; every edge leaving it goes down, so its index is $n$. The antipodal vertex $(1, -1, \cdots, (-1)^{n+1})$ minimizes $F$, with value $-n$; every edge leaving it goes up, so its index is $0$.
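As a sanity check (not part of the original problem), here is a short Python sketch that enumerates all $2^n$ vertices and computes the index directly. Flipping coordinate $i$ of a vertex $v$ changes $F$ by $-2c_i v_i$ where $c_i = (-1)^i$, so the edge in direction $i$ descends exactly when $c_i v_i > 0$; the function name `index_at_vertex` is my own.

```python
from itertools import product

def index_at_vertex(v):
    """Number of edges of the cube descending from vertex v,
    relative to F(x) = -x1 + x2 - ... + (-1)^n xn.
    Flipping coordinate i changes F by -2*c_i*v_i with c_i = (-1)^i,
    so that edge goes down iff c_i * v_i > 0."""
    return sum(1 for i, vi in enumerate(v, start=1) if (-1)**i * vi > 0)

n = 4
# enumerate all 2^n vertices of the cube [-1, 1]^n
vertices = list(product([-1, 1], repeat=n))
index_n = [v for v in vertices if index_at_vertex(v) == n]
index_0 = [v for v in vertices if index_at_vertex(v) == 0]
print(index_n)  # [(-1, 1, -1, 1)] -- the unique maximizer of F
print(index_0)  # [(1, -1, 1, -1)] -- the unique minimizer of F
```

For every $n$ the enumeration finds exactly one vertex of index $n$ (the maximizer) and one of index $0$ (the minimizer), matching the geometric argument.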
At any vertex, each term of $F(\mathbf x)$ equals $\pm 1$, and if $n$ is odd, it is not possible to add an odd number of $\pm 1$ terms to get the even number $0$, so $F$ never vanishes at a vertex. If $n$ is even, $F(\mathbf x) = 0$ exactly when half the terms are $+1$ and half are $-1$, which happens at ${n \choose \frac 12 n}$ vertices.
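The parity claim and the binomial count are easy to verify by brute force for small $n$; this snippet (my own check, not from the problem) counts the vertices where $F$ vanishes and compares against ${n \choose n/2}$ for even $n$ and $0$ for odd $n$.

```python
from itertools import product
from math import comb

def F(v):
    """F(x) = -x1 + x2 - x3 + ... + (-1)^n xn at a vertex v of [-1, 1]^n."""
    return sum((-1)**i * vi for i, vi in enumerate(v, start=1))

# count the vertices of the n-cube where F vanishes, for small n
counts = {n: sum(1 for v in product([-1, 1], repeat=n) if F(v) == 0)
          for n in range(1, 9)}
for n, c in counts.items():
    expected = comb(n, n // 2) if n % 2 == 0 else 0
    assert c == expected
    print(n, c)
```

For example, $n = 4$ gives ${4 \choose 2} = 6$ vertices with $F(\mathbf x) = 0$, and every odd $n$ gives none.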