Parallelepiped inequality and probabilistic proof


Let $(v_i)_{i=1}^n$ be linearly independent unit vectors of $\mathbb{R}^n$ and $V=\{\sum_{i=1}^n \alpha_iv_i:0 \le \alpha_i\le 1\}$.

Given $v=\sum_{i=1}^nx_iv_i \in V$, prove that there exists a vertex $y$ of $V$ (that is, $y=\sum_{i=1}^n y_iv_i$ with $y_i \in \{0,1\}$) such that $||y-v|| \le \sqrt n/2$.
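Before the proof, here is a numerical sanity check of the statement (an illustration, not part of the argument): for random unit vectors $v_i$ and a random point $v \in V$, brute-force over all $2^n$ vertices and confirm that the nearest one is within $\sqrt n/2$. The helper `unit` and the trial parameters are my own choices for the experiment.

```python
import itertools
import math
import random

random.seed(0)

def unit(v):
    """Normalize a vector to unit length."""
    s = math.sqrt(sum(c * c for c in v))
    return [c / s for c in v]

n = 4
for trial in range(100):
    # Random Gaussian directions are linearly independent with probability 1.
    vs = [unit([random.gauss(0, 1) for _ in range(n)]) for _ in range(n)]
    # Coordinates x_i in [0,1] of the point v in the basis (v_i).
    x = [random.random() for _ in range(n)]
    v = [sum(x[i] * vs[i][k] for i in range(n)) for k in range(n)]
    # Distance from v to the nearest vertex y = sum y_i v_i, y_i in {0,1}.
    best = min(
        math.sqrt(sum((sum(y[i] * vs[i][k] for i in range(n)) - v[k]) ** 2
                      for k in range(n)))
        for y in itertools.product([0, 1], repeat=n)
    )
    assert best <= math.sqrt(n) / 2 + 1e-12

print("all trials passed")
```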

My attempt:

Let $(X_i)_{i=1}^n$ be independent random variables (on a certain probability space $(\Omega, F, P)$) such that $X_i$ has Bernoulli distribution of parameter $x_i$.

Consider $Y=\sum X_iv_i$ and $Z=||Y - v||^2$.

If $Z \le n/4$ a.s., then, since $Z$ takes only finitely many values, there exists $z$ in the image of $Z$ with $P(Z=z)>0$ and $z\le n/4$. This means that there exists $y$ in the image of $Y$ with $P(Y=y)>0$, hence a vertex of $V$, such that $||y-v||^2 \le n/4$, i.e. $||y-v|| \le \sqrt n/2$;

If $P(Z \gt n/4) \gt 0$, then the Markov bound is strict: $E(Z) \ge E(Z\,\mathbf 1_{\{Z \gt n/4\}}) \gt (n/4)\,P(Z \gt n/4)$, since $Z \gt n/4$ holds on an event of positive probability. Hence $P(Z \le n/4)=1-P(Z \gt n/4)\gt 1-E(Z)/(n/4)\ge 0$, because $E(Z) \le n/4$ as shown below. So there exists $z\le n/4$ with $P(Z=z)>0$, and we conclude as above.

Now I prove that $E(Z)\le n/4$.

Since $Y-v=\sum_i (X_i-x_i)v_i$, expanding the squared norm gives

$E(Z)=\sum_{i\ne j}E[(X_i-x_i)(X_j-x_j)]\,\langle v_i,v_j\rangle + \sum_i E[(X_i-x_i)^2]\,||v_i||^2.$

The cross terms vanish because the $X_i$ are independent and $E(X_i)=x_i$, and $||v_i||=1$, so

$E(Z)=\sum_i \operatorname{Var}(X_i)=\sum_i x_i(1-x_i)\le \frac{n}{4},$

because $x_i(1-x_i) \le 1/4$ for $x_i \in [0,1]$.
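The identity $E(Z)=\sum_i x_i(1-x_i)$ can also be checked numerically by summing $||y-v||^2$ over all $2^n$ outcomes of $(X_1,\dots,X_n)$ weighted by their probabilities (again just an illustration; the random instance and tolerances are arbitrary choices):

```python
import itertools
import math
import random

random.seed(1)
n = 3

def unit(v):
    """Normalize a vector to unit length."""
    s = math.sqrt(sum(c * c for c in v))
    return [c / s for c in v]

vs = [unit([random.gauss(0, 1) for _ in range(n)]) for _ in range(n)]
x = [random.random() for _ in range(n)]
v = [sum(x[i] * vs[i][k] for i in range(n)) for k in range(n)]

# E(Z) by exact enumeration: P(X_i = 1) = x_i, independently.
EZ = 0.0
for bits in itertools.product([0, 1], repeat=n):
    p = math.prod(x[i] if b else 1 - x[i] for i, b in enumerate(bits))
    y = [sum(bits[i] * vs[i][k] for i in range(n)) for k in range(n)]
    EZ += p * sum((y[k] - v[k]) ** 2 for k in range(n))

closed_form = sum(xi * (1 - xi) for xi in x)
assert abs(EZ - closed_form) < 1e-12  # E(Z) = sum_i x_i(1 - x_i)
assert EZ <= n / 4                     # and hence E(Z) <= n/4
print("E[Z] matches sum x_i(1-x_i)")
```

Note that the check passes even though the $v_i$ are not orthogonal: the inner products $\langle v_i,v_j\rangle$ only multiply the cross terms, which have zero expectation.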

Is this correct?