It is well known, by the Cauchy-Schwarz inequality, that $$ \langle x, x\rangle \langle y, y\rangle - \langle x, y\rangle^2 \geq 0 $$ for any $x,y\in H$, where $H$ is a Hilbert space over the real scalars.
When this expression is computed numerically, it can come out slightly negative (on the order of machine precision) due to round-off error in the subtraction.
Is there an equivalent expression that is guaranteed to be nonnegative in floating-point arithmetic? (E.g., something squared, the norm of some vector, etc.)
There is Lagrange's Identity:
$$\langle x,x \rangle \langle y,y\rangle - \langle x,y \rangle^2 = \sum_{1 \le i < j \le n} (x_i y_j - x_j y_i)^2$$
for $x, y \in \mathbb{R}^n$.
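Since every term on the right-hand side is a square, the sum is nonnegative in floating-point arithmetic, at the cost of $O(n^2)$ work instead of $O(n)$. A minimal sketch comparing the direct formula with the Lagrange-identity form (the function names and test vectors are illustrative, not from the question):

```python
import numpy as np

def gram_naive(x, y):
    # Direct evaluation <x,x><y,y> - <x,y>^2; the subtraction of two
    # nearly equal quantities can produce a slightly negative result.
    return np.dot(x, x) * np.dot(y, y) - np.dot(x, y) ** 2

def gram_lagrange(x, y):
    # Lagrange's identity: sum over i < j of (x_i y_j - x_j y_i)^2.
    # Each term is a square, so the accumulated sum is always >= 0,
    # at O(n^2) cost rather than O(n).
    n = len(x)
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            total += (x[i] * y[j] - x[j] * y[i]) ** 2
    return total

rng = np.random.default_rng(0)
x = rng.standard_normal(50)
# Nearly parallel vectors: the worst case for cancellation in gram_naive.
y = x + 1e-9 * rng.standard_normal(50)

print(gram_naive(x, y))     # may be negative due to round-off
print(gram_lagrange(x, y))  # always nonnegative
```

Equivalently, for $n = 3$ the right-hand side is $\lVert x \times y \rVert^2$, the squared norm of the cross product.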