Let $f$ be a quadratic form over $\mathbb{R}$ and write $$f(x)=xAx^t,$$ where $A\in M_n(\mathbb{R}_{+})$ is symmetric with positive entries and $x=(x_1,x_2,\ldots,x_n)$. Suppose that $A$ has exactly one positive eigenvalue and $n-1$ negative eigenvalues.
Is it true that for any $0\neq u\in \mathbb{R}_{\geq 0}^n$ and any $v\in \mathbb{R}^n$ not parallel to $u$, $$(u^tAv)^2>(u^tAu)(v^tAv)?$$
I tried to find a proof of this result but failed. Can anyone help me or provide some references? Thanks a lot!
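Before looking for a proof, a quick numerical sanity check is easy. This is just a sketch: the family $A=J-tI$ ($J$ the all-ones matrix, $0<t<1$) is one convenient choice of a symmetric matrix with positive entries and exactly one positive eigenvalue, and the vectors $u,v$ are sampled at random.

```python
import numpy as np

rng = np.random.default_rng(0)
n, t = 5, 0.5

# A = J - t*I: off-diagonal entries 1, diagonal entries 1 - t > 0,
# eigenvalues n - t (once) and -t (with multiplicity n - 1).
A = np.ones((n, n)) - t * np.eye(n)

assert np.all(A > 0)
eig = np.linalg.eigvalsh(A)
assert np.sum(eig > 0) == 1 and np.sum(eig < 0) == n - 1

for _ in range(1000):
    u = rng.random(n)           # nonnegative entries, almost surely nonzero
    v = rng.standard_normal(n)  # generic, almost surely not parallel to u
    lhs = (u @ A @ v) ** 2
    rhs = (u @ A @ u) * (v @ A @ v)
    assert lhs > rhs            # the conjectured strict inequality
```

Every random trial satisfies the strict inequality, which at least makes the claim plausible.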
Denote the eigenvalues of $A$ in decreasing order by $\lambda_1(>0)>\lambda_2\geq\cdots\geq\lambda_n$ and the corresponding orthonormal eigenvectors by $z_1, z_2, \ldots, z_n$.
So $z_1$ with $\lVert z_1 \rVert=1$ is the eigenvector for the positive eigenvalue $\lambda_1$; let $U_{\perp}$ be the subspace spanned by the eigenvectors for the negative eigenvalues. The restriction of $A$ to $U_{\perp}$ is negative definite, so if $v\in U_{\perp}\setminus\{0\}$, then $v^TAv<0$, and the right-hand side of the inequality is negative while the left-hand side is nonnegative, so the inequality holds. Also $u$ cannot be in $U_{\perp}$: since $u$ has nonnegative entries, not all zero, and $A$ has positive entries, we get $u^TAu>0$.
Thus $u^TAu>0$, and since the case $v\in U_\perp$ is settled, from now on assume $v\notin U_\perp$. Since $u$ and $v$ are non-parallel, $\operatorname{span}\{u,v\}$ is two-dimensional, while $U_\perp$ has dimension $n-1$, so their intersection is nontrivial (or else we would have $n+1$ linearly independent vectors $u,v,z_2,\ldots,z_n$ in $\mathbb{R}^n$); it is in fact one-dimensional, since neither $u$ nor $v$ lies in $U_\perp$. So some combination of $u$ and $v$, unique up to scaling, lies in $U_\perp$. Expand this combination in the eigenvectors $z_2,\ldots,z_n$ and let $z^*$ be one of these eigenvectors with a non-zero weight. If we drop $z^*$, then $\{u,v\}\cup\{z_j : z_j\neq z_1, z^*\}$ is a linearly independent set: any dependence would produce a combination of $u$ and $v$ in $U_\perp$ with zero $z^*$-weight, which must be the zero vector by uniqueness.
Define an $n\times n$ matrix $S$ with first row $u^T$, second row $v^T$, and the remaining rows the $z_j^T$ with $z_j\notin\{z_1,z^*\}$, in decreasing order of the corresponding eigenvalues: $$S=\left[\begin{array}{ccc} \cdots & u^T & \cdots \\ \cdots & v^T & \cdots \\ \hline \vdots & z_j^T,\ z_j\notin \{z_1,z^*\} & \vdots \end{array}\right]$$
Clearly $S$ is non-singular because all rows are linearly independent as we argued before. By Sylvester's law of inertia, $SAS^T$ also has one positive and $n-1$ negative eigenvalues. This is how $SAS^T$ looks:
$$SAS^T=\left[\begin{array}{cc|c} u^TAu & u^TAv & \cdots \\ v^TAu & v^TAv & \cdots \\ \hline \cdots & \cdots & \mathscr{D} \end{array}\right]$$ where $\mathscr{D}$ is the diagonal matrix whose diagonal entries are, in decreasing order, the $\lambda_j$ corresponding to the $z_j$ in $S$. Denote by $X$ the leading principal $2\times 2$ submatrix: $$X=\begin{bmatrix} u^TAu & u^TAv\\ v^TAu & v^TAv \end{bmatrix}$$
Since $X$ is a principal submatrix of $SAS^T$, Cauchy's interlacing theorem bounds its smaller eigenvalue above by the second-largest eigenvalue of $SAS^T$, which is negative; and since $e_1^TXe_1=u^TAu>0$, its larger eigenvalue is positive. So the product of the eigenvalues of $X$ is negative. But this product equals the determinant. Thus: $$u^TAu\cdot v^TAv -u^TAv\cdot v^TAu < 0 \iff (u^TAv)^2 > u^TAu\cdot v^TAv \tag*{$\blacksquare$}$$
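For what it's worth, the whole construction can be traced numerically. This NumPy sketch (assuming the same test matrix $A=J-tI$ as above and random $u,v$) finds the combination of $u$ and $v$ lying in $U_\perp$, picks $z^*$, builds $S$, and checks that $S$ is non-singular and $\det X<0$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, t = 5, 0.5
A = np.ones((n, n)) - t * np.eye(n)   # one positive, n-1 negative eigenvalues

lam, Z = np.linalg.eigh(A)            # ascending order: Z[:, -1] is z_1
u = rng.random(n)                     # nonnegative, nonzero
v = rng.standard_normal(n)            # generic, not parallel to u

# The combination of u and v lying in U_perp: its z_1-component cancels.
z1 = Z[:, -1]
w = np.dot(v, z1) * u - np.dot(u, z1) * v

# Expand w in the negative-eigenvalue eigenbasis and pick z* (nonzero weight).
weights = Z[:, :-1].T @ w
k = int(np.argmax(np.abs(weights)))

keep = [j for j in range(n - 1) if j != k]   # drop z_1 and z*
S = np.vstack([u, v, Z[:, keep].T])
assert abs(np.linalg.det(S)) > 1e-9          # rows are linearly independent

M = S @ A @ S.T
X = M[:2, :2]
assert np.linalg.det(X) < 0  # i.e. (u^T A v)^2 > (u^T A u)(v^T A v)
```

The final assertion is exactly the determinant inequality that closes the proof.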