Multivariate inequality


I am trying to prove the following:

$K = 2$:

$$ \sum_{i < j} p_i \cdot p_j \leq {N \choose 2}(\frac 1N \sum_{i} p_i)^2$$

$K = 3$:

$$ \sum_{i < j < m} p_i \cdot p_j \cdot p_m \leq {N \choose 3}(\frac 1N \sum_{i} p_i)^3$$

where $0 \leq p_i \leq 1$ for $i = 1, \dots, N$.

I simulated these inequalities and they seem to hold even for larger values of $K$.
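A simulation along these lines can be written as a short script (a sketch; the helper names `e_k` and `bound` are illustrative, not from the question):

```python
import itertools
import math
import random

def e_k(p, k):
    """Left-hand side: sum of products over all k-subsets of p."""
    return sum(math.prod(c) for c in itertools.combinations(p, k))

def bound(p, k):
    """Right-hand side: C(N, k) * (mean of p)^k."""
    n = len(p)
    return math.comb(n, k) * (sum(p) / n) ** k

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 8)
    p = [random.random() for _ in range(n)]  # each p_i in [0, 1]
    for k in range(1, n + 1):
        assert e_k(p, k) <= bound(p, k) + 1e-12  # tolerance for float rounding
```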

I thought that proving the concavity of the functions $\sum_{i \ne j} p_i \cdot p_j$ and $\sum_{i, j, m \text{ distinct}} p_i \cdot p_j \cdot p_m$ would suffice, and I computed the Hessian, but I don't see a way to prove it negative semidefinite.

In fact, I checked and the function is not concave.


There are 3 best solutions below

Answer 1:

Let $f_0(x_1,\dots,x_N) = 1$ and $$f_k(x_1,\dots,x_N) = \sum_{i=1}^N x_i\cdot f_{k-1}(x_1,\dots,\hat{x}_i,\dots,x_N),$$

where $\hat{x}_i$ means that $x_i$ does not appear.

Since $f_k(x_1,\dots,x_N) = k!\sum_{i_1 < \cdots < i_k} x_{i_1}\cdots x_{i_k}$ (each $k$-subset is picked up once for every ordering of its elements), the claim is proved if we show that, over the vectors $\vec{x}$ with nonnegative entries and $\sum_i x_i = C$, the function $f_k(x_1,\dots,x_N)$ is maximized at the symmetric point $x_1 = \cdots = x_N = \alpha$ (with $\alpha = C/N$).

We use Lagrange multipliers.

Let $$L(\vec{x},\lambda) = f_k(\vec{x}) + \lambda\cdot(\sum_i x_i - C)$$

$$\frac{\partial L}{\partial x_i} = k\cdot f_{k-1}(x_1,\dots,\hat{x}_i,\dots,x_N) + \lambda,$$ where the factor $k$ comes from the $k$ positions at which $x_i$ can appear in the recursion.

Setting this to zero gives $f_{k-1}(x_1,\dots,\hat{x}_i,\dots,x_N) = -\lambda/k =: C'$, the same constant for every $i$, and from this and the symmetry it follows that $x_i = \alpha$ for all $i$.
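A quick numerical sanity check of this claim (a sketch, with $f_k$ implemented directly from its recursive definition): random points on the simplex $\sum_i x_i = C$ never beat the symmetric point.

```python
import random

def f_k(x, k):
    """The recursion above: f_0 = 1, f_k = sum_i x_i * f_{k-1}(x without x_i)."""
    if k == 0:
        return 1
    return sum(x[i] * f_k(x[:i] + x[i + 1:], k - 1) for i in range(len(x)))

random.seed(1)
N, k, C = 6, 3, 4.0
best = f_k([C / N] * N, k)        # value at the symmetric point x_i = C/N
for _ in range(2000):
    w = [random.random() for _ in range(N)]
    s = sum(w)
    x = [C * wi / s for wi in w]  # random nonnegative point with sum(x) = C
    assert f_k(x, k) <= best + 1e-9
```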

Answer 2:

Here's another way to prove the $K=2$ inequality.

I'm going to define $A,B,C$ by $$ A=\sum_{i<j}p_ip_j,\qquad B=\sum_ip_i^2,\qquad C=\sum_{i<j}(p_i-p_j)^2. $$

Note that $$ \bigl(\sum_ip_i\bigr)^2=2A+B,\qquad C\ge0,\qquad C=(N-1)B-2A. $$

We are trying to prove $$ A\le{N\choose2}\biggl({1\over N}\sum_ip_i\biggr)^2={N-1\over2N}(2A+B). $$

This is equivalent to $$ 2NA\le2(N-1)A+(N-1)B, $$ which is equivalent to $2A\le(N-1)B$, which follows immediately from $C\ge0$.

Answer 3:

Fact 1. Let $p_i \ge 0$, $i = 1, 2, \cdots, N$ ($N \ge 2$), and let $1\le k \le N$. Then $$\sum_{i_1 < i_2 < \cdots < i_k} p_{i_1} p_{i_2} \cdots p_{i_k} \le \binom{N}{k}\left(\frac{1}{N}\sum_{i=1}^N p_i\right)^k.$$

Proof of Fact 1.

If $k = 1$, it is clear. In the following, assume that $k \ge 2$.

Let $Q > 0$ be an arbitrary constant. Consider the maximum of $f(p) := \sum_{i_1 < i_2 < \cdots < i_k} p_{i_1} p_{i_2} \cdots p_{i_k} $ subject to $\sum_{i=1}^N p_i = Q$ and $p_i \ge 0, i = 1, \cdots, N$. Clearly, the maximum is positive.

We claim that at the maximum, all $p_i$'s are equal. Assume, for the sake of contradiction, that at the maximum there exist $r < s$ such that $p_r \ne p_s$. Note that $$\sum_{i_1 < i_2 < \cdots < i_k} p_{i_1} p_{i_2} \cdots p_{i_k} = A p_r p_s + B(p_r + p_s) + C,$$ where $A, B, C$ do not involve $p_r, p_s$ (they are sums of products of the remaining variables, so $A, B, C \ge 0$).

If $A = 0$, then fewer than $k - 2$ of the remaining variables are positive, which forces $B = C = 0$ as well; then $\sum_{i_1 < i_2 < \cdots < i_k} p_{i_1} p_{i_2} \cdots p_{i_k} = 0$, which contradicts the positivity of the maximum. If $A > 0$, then since $p_r \ne p_s$, $$A p_r p_s + B(p_r + p_s) + C < A (p_r + p_s)^2/4 + B(p_r + p_s) + C,$$ and thus, if we replace both $p_r$ and $p_s$ with $\frac12(p_r + p_s)$, the constraint is preserved and the objective value strictly increases, which contradicts the optimality of $p$.

We are done.
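The smoothing step at the heart of this proof, that replacing two coordinates by their common average never decreases the objective, can be checked numerically (a sketch; `e_k` is an illustrative name for the left-hand side):

```python
import itertools
import math
import random

def e_k(p, k):
    """Sum of products over all k-subsets of p."""
    return sum(math.prod(c) for c in itertools.combinations(p, k))

random.seed(3)
for _ in range(500):
    n = random.randint(3, 7)
    k = random.randint(2, n)
    p = [random.random() for _ in range(n)]
    r, s = random.sample(range(n), 2)
    q = list(p)
    q[r] = q[s] = (p[r] + p[s]) / 2   # the averaging step from the proof
    assert e_k(q, k) >= e_k(p, k) - 1e-12  # tolerance for float rounding
```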