A sort of extension of Cauchy-Schwarz inequality?


Let $a_{i}$ and $b_{i}$, $i = 1,\dots,n$, be real numbers, and denote $S_{pq}:=\sum_{i=1}^{n}a_{i}^{p}b_{i}^{q}$.

I want to show that, $$ (S_{20}S_{02} - S_{11}^2)S_{00} - (S_{10}S_{02} - S_{01}S_{11})S_{10} + (S_{10}S_{11} - S_{01}S_{20})S_{01} \geq 0,$$ and find the condition on $a$ and $b$ for this inequality to become an equality.

I thought about it in several ways; one observation is that the left-hand side (LHS) of the inequality is the determinant of the matrix $$ \begin{bmatrix} S_{00} & S_{10} & S_{01} \\ S_{10} & S_{20} & S_{11} \\ S_{01} & S_{11} & S_{02} \end{bmatrix} $$
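As a sanity check (not a proof), one can verify numerically that the LHS really is the cofactor expansion of this determinant along its first row; the random data below is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
a, b = rng.normal(size=n), rng.normal(size=n)

# S_{pq} = sum_i a_i^p b_i^q
S = lambda p, q: np.sum(a**p * b**q)

M = np.array([[S(0, 0), S(1, 0), S(0, 1)],
              [S(1, 0), S(2, 0), S(1, 1)],
              [S(0, 1), S(1, 1), S(0, 2)]])

# The LHS of the inequality, term by term
lhs = ((S(2, 0)*S(0, 2) - S(1, 1)**2) * S(0, 0)
       - (S(1, 0)*S(0, 2) - S(0, 1)*S(1, 1)) * S(1, 0)
       + (S(1, 0)*S(1, 1) - S(0, 1)*S(2, 0)) * S(0, 1))

assert np.isclose(np.linalg.det(M), lhs)
assert lhs >= 0
```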

Also, I have a vague feeling that it is somehow connected to the Cauchy-Schwarz inequality. A higher-dimensional version of it, maybe?

In addition, I found that if $n=2$, then the LHS is always $0$. This reminded me of the typical consequence of Cauchy-Schwarz, $$n\sum_{i=1}^{n}a_{i}^{2} - \Big(\sum_{i=1}^{n}a_{i}\Big)^{2} \geq 0, $$ which becomes an equality when $n=1$.

This is one of the reasons that I think the inequality is somewhat related to Cauchy-Schwarz.

Actually, I managed to prove it in a geometric way (though I am not perfectly sure of it), but I would like a simpler, more algebraic proof. Thanks in advance for any help or hint.

Edit: I'm wondering now if I can show that the matrix is nonnegative definite.

Accepted answer:

We have $$ S = \begin{pmatrix} S_{00} & S_{10} & S_{01} \\ S_{10} & S_{20} & S_{11} \\ S_{01} & S_{11} & S_{02} \end{pmatrix} = A^T A $$ where $A$ is the $n \times 3$ matrix $$ A = \begin{pmatrix} 1 & a_1 & b_1 \\ 1 & a_2 & b_2 \\ \vdots & \vdots & \vdots \\ 1 & a_n & b_n \end{pmatrix} \, . $$

It follows that $$ x^T S x = \Vert A x \Vert^2 \ge 0 $$ for all $x \in \Bbb R^3$, i.e. $S$ is positive semidefinite. It follows that the eigenvalues $\lambda_1, \lambda_2, \lambda_3$ of $S$ are (real and) nonnegative and $\det(S) = \lambda_1\lambda_2\lambda_3 \ge 0$.
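The identity $x^T S x = \Vert Ax \Vert^2$ is easy to illustrate numerically (a minimal sketch, with rows of $A$ taken as $(1, a_i, b_i)$ as above and arbitrary random data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
a, b = rng.normal(size=n), rng.normal(size=n)

# Rows of A are (1, a_i, b_i), so S = A^T A is the matrix above
A = np.column_stack([np.ones(n), a, b])
S = A.T @ A

# x^T S x = x^T A^T A x = ||Ax||^2 for any x
x = rng.normal(size=3)
assert np.isclose(x @ S @ x, np.linalg.norm(A @ x)**2)
assert np.linalg.det(S) >= 0
```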

  • If the columns of $A$ are linearly dependent, i.e. there is $(\alpha, \beta, \gamma) \neq (0,0,0)$ with $\alpha + \beta a_i + \gamma b_i = 0$ for all $i$ (equivalently, the points $(a_i, b_i)$ all lie on a common line), then $Sx = A^TAx = 0$ for some nonzero vector $x$, so that $0$ is an eigenvalue of $S$ and $\det(S) = 0$. This is always the case if $n \le 2$. This is exactly the equality condition asked for.

  • If the columns of $A$ are linearly independent then $x^T S x > 0$ for all nonzero vectors $x$, so that $S$ is positive definite and $\det(S) > 0$.
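Both cases can be checked in a small numerical experiment (the line $b_i = 2 + 3a_i$ below is an arbitrary choice of collinear data): collinear points make the columns of $A$ dependent and force $\det(S)=0$, while generic points give $\det(S)>0$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
a = rng.normal(size=n)

def gram_det(a, b):
    # det(A^T A) for A with rows (1, a_i, b_i)
    A = np.column_stack([np.ones_like(a), a, b])
    return np.linalg.det(A.T @ A)

# Collinear: b_i = 2 + 3 a_i, so the columns (1, a_i, b_i) are dependent
print(gram_det(a, 2 + 3 * a))            # numerically ~ 0
# Generic b: columns independent (for n >= 3), so the determinant is positive
print(gram_det(a, rng.normal(size=n)))
```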