Tighter version of Cauchy-Schwarz?


I checked numerically that $$ \left( \sum_N p_N \dfrac{C_N}{D_N^2} \right) \left( \sum_N p_N D_N \right) \geq \left( \sum_N p_N \dfrac{C_N}{D_N} \right) $$ where $$ \sum_N p_N = 1 \;, \quad 0\leq p_N\leq 1 \;, \quad 0\leq \dfrac{C_N}{D_N} \leq 1 \;, $$ but I'm not able to prove it analytically. Using Cauchy-Schwarz gives the bound $$ \left( \sum_N p_N \dfrac{C_N}{D_N^2} \right) \left( \sum_N p_N D_N \right) \geq \left(\sum_N \sqrt{p_N \dfrac{C_N}{D_N^2}\, p_N D_N}\right)^2 = \left(\sum_N p_N \sqrt{\dfrac{C_N}{D_N}} \right)^2 $$ which, by Jensen's inequality, is smaller than $\left( \sum_N p_N \dfrac{C_N}{D_N} \right)$, so Cauchy-Schwarz alone does not settle the question.

Is there a way to prove my first inequality?


There are 2 best solutions below

On BEST ANSWER

A minimal counter-example:

  • $p_1 = p_2 = 1/2$

  • $C_1 = 0, C_2 = 1$

  • $D_1, D_2 > 0$, with exact values to be chosen below

  • $LHS = {1 \over 2 D_2^2} {D_1 + D_2 \over 2} = {1 \over 4} {D_1 + D_2 \over D_2^2}$

  • $RHS = {1 \over 2 D_2} = {1 \over 4}{D_2 + D_2 \over D_2^2},$ so $RHS \gtrless LHS$ according as $D_2 \gtrless D_1$. Taking $D_2 > D_1$ gives a counter-example.
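The counter-example above can be checked numerically; this is just a sketch, and the concrete values $D_1 = 1$, $D_2 = 2$ are my own choice (any $D_2 > D_1$ works):

```python
# Two-point counter-example: p_1 = p_2 = 1/2, C_1 = 0, C_2 = 1,
# and (my own choice) D_1 = 1, D_2 = 2.
p = [0.5, 0.5]
C = [0.0, 1.0]
D = [1.0, 2.0]

# LHS = (sum p C/D^2)(sum p D), RHS = sum p C/D
lhs = (sum(pi * ci / di**2 for pi, ci, di in zip(p, C, D))
       * sum(pi * di for pi, di in zip(p, D)))
rhs = sum(pi * ci / di for pi, ci, di in zip(p, C, D))

print(lhs, rhs)  # LHS = 0.1875 < RHS = 0.25: the conjectured inequality fails
```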

More abstractly, interpreting $p_i$ as probabilities and $C,D$ as random variables, $LHS = E[{C\over D^2}] E[D]$ and $RHS =E[{C\over D^2} D] = E[{C\over D}],$ so

$$RHS - LHS =Cov({C\over D^2}, D)$$

and the counter-example is simply a case where, when $C/D^2$ increases, $D$ also increases, so the covariance is positive.
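The covariance identity is easy to verify on the same two-point distribution; a sketch, again with my own choice $D_1 = 1$, $D_2 = 2$:

```python
# Verify RHS - LHS = Cov(C/D^2, D) on a concrete two-point distribution.
# The values D_1 = 1, D_2 = 2 are my own choice.
p = [0.5, 0.5]
C = [0.0, 1.0]
D = [1.0, 2.0]

def E(f):
    """Expectation of f(C_i, D_i) under the weights p_i."""
    return sum(pi * f(ci, di) for pi, ci, di in zip(p, C, D))

rhs = E(lambda c, d: c / d)                         # E[C/D] = E[(C/D^2) * D]
lhs = E(lambda c, d: c / d**2) * E(lambda c, d: d)  # E[C/D^2] E[D]
cov = rhs - lhs                                     # Cov(C/D^2, D)

print(cov)  # 0.0625 > 0, so here RHS > LHS
```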


I don't think your conjecture is true. Although you might have finite sums in mind, you can test your hypothesis by summing over all $N \in \mathbb N$ with $$ p_N = \left( \frac 12 \right)^N, \quad C_N = \left( \frac 15 \right)^N, \quad D_N = \left( \frac 45 \right)^N.$$
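With these choices each sum is a geometric series ($\sum_{N\ge 1} r^N = r/(1-r)$ for $|r|<1$), which makes the check easy; a sketch:

```python
# Closed-form evaluation of the three sums for
# p_N = (1/2)^N, C_N = (1/5)^N, D_N = (4/5)^N, N = 1, 2, 3, ...
#   sum p C/D^2 = sum (5/32)^N,  sum p D = sum (2/5)^N,  sum p C/D = sum (1/8)^N

def geom(r):
    """Sum of r^N over N = 1, 2, 3, ... for |r| < 1."""
    return r / (1 - r)

lhs = geom(5 / 32) * geom(2 / 5)  # = (5/27)(2/3) = 10/81
rhs = geom(1 / 8)                 # = 1/7

print(lhs, rhs)  # lhs = 10/81 ≈ 0.1235 < rhs = 1/7 ≈ 0.1429
```

Since $10/81 < 1/7$, this infinite family is indeed a counter-example.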