Prove or disprove $\tfrac{b^{\top}Ab}{b^{\top}A^{-1}b}\leq\tfrac{\lVert Ab \rVert^{2}}{\lVert b \rVert^{2}}$


Let $A$ be positive definite and symmetric and $b\neq 0$. Is $\tfrac{b^{\top}Ab}{b^{\top}A^{-1}b}\leq\tfrac{\lVert Ab \rVert^{2}}{\lVert b \rVert^{2}}$ true? Any idea how to prove it, maybe with Cauchy-Schwarz or a simple counterexample?
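Before searching for a proof, a quick numerical experiment (a sanity check, not a proof) can probe for counterexamples by sampling random symmetric positive definite matrices and nonzero vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample random SPD matrices A and vectors b, and compare
# b^T A b / b^T A^{-1} b against ||Ab||^2 / ||b||^2.
for _ in range(1000):
    n = int(rng.integers(2, 6))
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)          # symmetric positive definite by construction
    b = rng.standard_normal(n)
    lhs = (b @ A @ b) / (b @ np.linalg.inv(A) @ b)
    rhs = (A @ b) @ (A @ b) / (b @ b)
    assert lhs <= rhs + 1e-9 * max(1.0, rhs), (lhs, rhs)
print("no counterexample found")
```

No counterexample turns up, which suggests trying to prove the inequality rather than refute it.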

The RHS is the Rayleigh quotient: $$R(A^TA, b) = \dfrac{b^T(A^TA)b}{b^Tb} = \dfrac{\sum_{i=1}^n\alpha^2_i\lambda_i^2}{\sum_{i=1}^n\alpha_i^2},$$ where $b = \sum_i\alpha_iv_i$, with the $v_i$ an orthonormal basis of eigenvectors of $A$ and $\lambda_i$ the corresponding eigenvalues, which are all positive. Now express the LHS in the same manner and check whether the inequality holds.

Edit: I will add the details since I am bored. With the same setup, the LHS is: $$\dfrac{ \sum_i\alpha_iv_i^T \sum_j\alpha_jAv_j}{\sum_i\alpha_iv_i^T \sum_j\alpha_jA^{-1}v_j} = \dfrac{\sum_i\alpha^2_i\lambda_i}{\sum_j\alpha_j^2\lambda_j^{-1}},$$ where we used the fact that a real symmetric matrix has a complete set of orthonormal eigenvectors and that the eigenvalues of a positive definite matrix are positive. What remains is a pure analysis/inequality problem.

Let $a_i = \alpha_i^2\geq 0.$ Cross-multiplying both sides of the inequality, it suffices to show: $$\sum_i a_i\lambda_i^2\sum_ja_j\lambda^{-1}_j - \sum_i a_i\sum_ja_j\lambda_j=\sum_{i,j}a_ia_j(\lambda_i^2\lambda_j^{-1}-\lambda_j)\geq 0.$$ The diagonal terms $i=j$ vanish, so, pairing the off-diagonal terms, $$\sum_{i<j}a_ia_j(\lambda^2_i\lambda_j^{-1}-\lambda_j)+\sum_{i>j}a_ia_j(\lambda^2_i\lambda_j^{-1}-\lambda_j)=$$ $$= \sum_{i<j}a_ia_j(\lambda^2_i\lambda_j^{-1}-\lambda_j+\lambda_j^2\lambda_i^{-1}-\lambda_i) = \sum_{i<j}a_ia_j\dfrac{(\lambda_i-\lambda_j)^2(\lambda_i+\lambda_j)}{\lambda_i\lambda_j}\geq 0.$$
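The cross-multiplied difference and its closed form can be checked numerically (a sanity check on the algebra, with positive eigenvalues $\lambda_i$ and weights $a_i \geq 0$ drawn at random):

```python
import numpy as np

rng = np.random.default_rng(1)

# Verify: sum_i a_i lam_i^2 * sum_j a_j/lam_j - sum_i a_i * sum_j a_j lam_j
# equals sum_{i<j} a_i a_j (lam_i - lam_j)^2 (lam_i + lam_j) / (lam_i lam_j),
# which is manifestly nonnegative.
for _ in range(200):
    n = 5
    lam = rng.uniform(0.1, 10.0, n)   # positive eigenvalues
    a = rng.uniform(0.0, 1.0, n)      # a_i = alpha_i^2 >= 0
    diff = (a * lam**2).sum() * (a / lam).sum() - a.sum() * (a * lam).sum()
    closed = sum(
        a[i] * a[j] * (lam[i] - lam[j])**2 * (lam[i] + lam[j]) / (lam[i] * lam[j])
        for i in range(n) for j in range(i + 1, n)
    )
    assert diff >= -1e-9
    assert abs(diff - closed) < 1e-6 * max(1.0, abs(diff))
print("identity verified")
```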

Alternatively, one can cleverly use the Power-Mean inequality combined with Jensen's to prove it as well.

Let $f(p) = \log (b^T A^p b)$. Then, taking logs of both sides, your inequality says $f(1) - f(-1) \le f(2) - f(0)$. And this is true because $f(p)$ is a convex function of $p$. To check that: $$ \eqalign{\dfrac{d^2}{dp^2} f(p) &= \dfrac{d}{dp} \dfrac{b^T A^p \log(A) b}{b^T A^p b}\cr & = \dfrac{b^T A^p \log(A)^2 b}{b^T A^p b} - \dfrac{(b^T A^p \log(A) b)^2}{(b^T A^p b)^2}\cr &= \dfrac{(b^T A^p b)(b^T A^p \log(A)^2 b) - (b^T A^p \log(A) b)^2}{(b^T A^p b)^2} }$$ and $(b^T A^p \log(A) b)^2 \le (b^T A^p b)(b^T A^p \log(A)^2 b)$ by Cauchy-Schwarz applied to the inner product $\langle x, y\rangle = x^T A^p y$ with $x = b$ and $y = \log(A)\,b$.
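The convexity of $f(p) = \log(b^T A^p b)$, and the resulting inequality $f(1) - f(-1) \le f(2) - f(0)$, can also be checked numerically via the eigendecomposition of $A$ (a sanity check, not a proof; $A^p$ is evaluated as $\sum_i c_i^2 \lambda_i^p$ with $c = V^T b$):

```python
import numpy as np

rng = np.random.default_rng(2)

def f(p, lam, c2):
    # f(p) = log(b^T A^p b) in the eigenbasis of A:
    # b^T A^p b = sum_i c_i^2 lam_i^p, where c = V^T b.
    return np.log((c2 * lam**p).sum())

# Check midpoint convexity of f, and the specific consequence
# f(1) - f(-1) <= f(2) - f(0) used above.
for _ in range(500):
    n = 4
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)         # symmetric positive definite
    lam, V = np.linalg.eigh(A)
    c2 = (V.T @ rng.standard_normal(n))**2
    p, q = rng.uniform(-3, 3, 2)
    assert f((p + q) / 2, lam, c2) <= (f(p, lam, c2) + f(q, lam, c2)) / 2 + 1e-9
    assert f(1, lam, c2) - f(-1, lam, c2) <= f(2, lam, c2) - f(0, lam, c2) + 1e-9
print("convexity checks passed")
```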