Quadratic form vanishing at certain points


Let $A\in\mathbb{R}^{d\times d}$ be a symmetric matrix, and let $X_1,\dots, X_n\in \mathbb{R}^d$ be vectors with $n>d$ (if more convenient, one can assume ${\rm span}(X_1,\dots,X_n)=\mathbb{R}^d$).

Assume that, for every $i$, $X_i^T AX_i=0$. What does this tell us about $A$ (as a function of $X_1,\dots,X_n$, of course)?

For instance, if we had $x^T Ax = 0$ for every $x\in\mathbb{R}^d$, then $A$ would be skew-symmetric, that is, $A^T=-A$, which, together with the symmetry of $A$, would yield $A=0$.

As another example, take $n=d$ and $X_i=e_i$, the $i^{\text{th}}$ element of the standard basis of $d$-dimensional Euclidean space. Then all we can say is that the diagonal entries of $A$ are $0$.
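As a quick numerical sanity check of this example (a sketch not in the original post, using numpy): any symmetric matrix with zero diagonal satisfies all the constraints $e_i^T A e_i = 0$, so the off-diagonal entries are completely unconstrained.

```python
import numpy as np

# With X_i = e_i, the constraints e_i^T A e_i = 0 only pin down the
# diagonal of A: here is a nonzero symmetric A that satisfies them all.
d = 3
A = np.array([[0., 1., 2.],
              [1., 0., 3.],
              [2., 3., 0.]])
I = np.eye(d)

# Every constraint holds even though A != 0.
print(all(np.isclose(I[i] @ A @ I[i], 0.0) for i in range(d)))  # True
```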

If it makes things simpler: assume that $n>d^2$ and that $X_1,\dots,X_n$ are iid random vectors with iid standard normal entries. What happens then? Can we deduce $A=0$ if, conditional on some event, we have more equations than unknowns?
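One way to probe this question numerically (a hedged sketch, not part of the original post): each constraint $X_i^T A X_i = 0$ is linear in the $d(d+1)/2$ free entries of a symmetric $A$, so we can build the coefficient matrix of the system and check whether it has full column rank, which would force $A=0$. With iid Gaussian vectors this appears to happen already at $n = d(d+1)/2$.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
m = d * (d + 1) // 2   # number of free entries of a symmetric d x d matrix
n = m                  # exactly as many equations as unknowns

X = rng.standard_normal((n, d))

# Row i holds the coefficients of the linear equation X_i^T A X_i = 0
# in the unknowns a_{jk}, j <= k: the coefficient of a_{jj} is x_j^2,
# and the coefficient of a_{jk} (j < k) is 2 x_j x_k.
rows = []
for x in X:
    coeffs = []
    for j in range(d):
        for k in range(j, d):
            coeffs.append(x[j] * x[k] if j == k else 2 * x[j] * x[k])
    rows.append(coeffs)
M = np.array(rows)

rank = np.linalg.matrix_rank(M)
print(rank == m)  # full column rank: the only symmetric solution is A = 0
```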

BEST ANSWER

Here's a partial answer. If $\ n\ge \frac{d\left(d+1\right)} {2}\ $, and the matrices $\ X_i X_i^\top,\ $ $i= 1,2,\dots,n\ $, span the space of symmetric $\ d\times d\ $ matrices (which will be true with probability $1$ whenever the entries of the $\ X_i\ $ are independent normal variates with positive variance), then you can conclude that $\ A=0\ $.

This follows from the fact that if the stated condition holds, then for every pair of integers $\ j,k\ $ with $\ 1\le j\le k \le d\ $, there exist $\ \alpha_1, \alpha_2, \dots, \alpha_n\ $ such that $\ \sum\limits_{i=1}^n \alpha_i X_i X_i^\top=e_j e_k^\top + e_k e_j^\top \ $. Since $\ x^\top Ax = \operatorname{tr}\left(A\,xx^\top\right)\ $, we then have $\ 0= \sum\limits_{i=1}^n \alpha_i X_i^\top AX_i = \operatorname{tr}\big(A\left(e_j e_k^\top + e_k e_j^\top\right)\big)= e_k^\top Ae_j + e_j^\top Ae_k= 2a_{jk}\ $, where $\ a_{jk}\ $ is the entry in the $\ j^\mathrm{\,th}\ $ row and $\ k^\mathrm{\,th}\ $ column of $\ A\ $.
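The argument above can be checked numerically (a sketch under the answer's assumptions, using numpy; the indices $j,k$ and the dimensions below are illustrative choices): solve for the $\alpha_i$ that combine the outer products $X_i X_i^\top$ into $e_j e_k^\top + e_k e_j^\top$, then verify that the same combination of quadratic forms picks out $2a_{jk}$ for a symmetric $A$.

```python
import numpy as np

rng = np.random.default_rng(1)
d, j, k = 3, 0, 2
n = d * (d + 1) // 2           # enough vectors to span the symmetric matrices
X = rng.standard_normal((n, d))

# Columns of G are the vectorized outer products X_i X_i^T.
G = np.column_stack([np.outer(x, x).ravel() for x in X])

# Target symmetric basis element e_j e_k^T + e_k e_j^T.
E = np.zeros((d, d))
E[j, k] = E[k, j] = 1.0

# Solve sum_i alpha_i X_i X_i^T = E (exact a.s., since the outer
# products span the symmetric matrices with probability 1).
alpha, *_ = np.linalg.lstsq(G, E.ravel(), rcond=None)
combo = sum(a * np.outer(x, x) for a, x in zip(alpha, X))
print(np.allclose(combo, E))   # True

# For any symmetric A, sum_i alpha_i X_i^T A X_i = tr(A E) = 2 a_{jk}.
A = rng.standard_normal((d, d))
A = A + A.T
lhs = sum(a * (x @ A @ x) for a, x in zip(alpha, X))
print(np.isclose(lhs, 2 * A[j, k]))   # True
```

So if every $X_i^\top A X_i$ vanishes, the left-hand side is $0$ and each entry $a_{jk}$ must vanish, giving $A=0$.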