Consider the following system of equations in the unknown $X$: \begin{cases} \alpha^T X=0,\\ X^T AX=0, \end{cases} where $\alpha,X\in\mathbb R^3$ are nonzero and $A$ is an invertible real symmetric matrix. I want to prove that the system has a “unique” solution (i.e. all solutions are collinear) if and only if $\alpha^T A^{-1}\alpha=0$.
This problem comes from an interesting fact: a line $ax+by+c=0$ is tangent to a conic curve $$ \pmatrix{x&y&1}A\pmatrix{x\\ y\\ 1}=0 $$ if and only if $\alpha=\left(a,b,c\right)^T$ satisfies $\alpha^T A^{-1}\alpha=0$. I’m trying to use linear algebra to prove it.
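As a quick numerical sanity check of this tangency criterion (not a proof), take the unit circle $x^2+y^2=1$, whose homogeneous matrix is $A=\operatorname{diag}(1,1,-1)$; the line $x=1$ is tangent to it at $(1,0)$, while the line $x=2$ misses it. The example values below are chosen for illustration only:

```python
import numpy as np

# Unit circle x^2 + y^2 - 1 = 0 in homogeneous form (x y 1) A (x y 1)^T = 0.
A = np.diag([1.0, 1.0, -1.0])

# Line x = 1, i.e. 1*x + 0*y - 1 = 0: tangent to the circle at (1, 0).
tangent = np.array([1.0, 0.0, -1.0])
# Line x = 2, i.e. 1*x + 0*y - 2 = 0: does not meet the circle.
non_tangent = np.array([1.0, 0.0, -2.0])

Ainv = np.linalg.inv(A)
print(tangent @ Ainv @ tangent)          # ~ 0.0  -> tangent
print(non_tangent @ Ainv @ non_tangent)  # ~ -3.0 -> not tangent
```

The criterion correctly distinguishes the two lines: $\alpha^TA^{-1}\alpha$ vanishes exactly for the tangent one.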
So far I have proved the sufficiency, i.e. that $\alpha^T A^{-1}\alpha=0$ implies all the solutions are collinear, but when I started trying to prove the necessity, I found it more difficult.
Does anyone have any ideas?
By a change of basis, we may assume that $$ \alpha=\pmatrix{1\\ 0\\ 0}\ \text{ and }\ A=\pmatrix{q&b^T\\ b&P}, $$ where $q$ is a scalar, $b\in\mathbb R^2$ and $P$ is a $2\times2$ symmetric matrix.
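To make the reduction used below explicit: since $\alpha=e_1$, the constraint $\alpha^TX=0$ forces the first entry of $X$ to vanish, and the quadratic form then only sees the block $P$:

```latex
% Write X = (0, v)^T with v \in \mathbb{R}^2. Then
X^T A X
  = \begin{pmatrix} 0 & v^T \end{pmatrix}
    \begin{pmatrix} q & b^T \\ b & P \end{pmatrix}
    \begin{pmatrix} 0 \\ v \end{pmatrix}
  = v^T P v,
% so nonzero solutions X of the system correspond exactly (up to scaling)
% to nonzero v with v^T P v = 0.
```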
If $\alpha^TX=X^TAX=0$ has a unique non-trivial solution up to scaling, then $v^TPv=0$ also has a unique non-trivial solution up to scaling. It is not hard to see that $v^TPv=0$ has no non-trivial solution if $P$ is definite, and two linearly independent solutions if $P$ is zero or indefinite. Therefore $P$ has to be a rank-$1$ semidefinite matrix. But then the $(1,1)$ cofactor of $A$, namely $\det P$, is zero, and in turn so is the $(1,1)$ entry of $A^{-1}$. Hence $\alpha^TA^{-1}\alpha=0$.
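One can check the last step numerically on a concrete instance of the block form. The values below are arbitrary illustrations: $q=5$, $b=(2,3)^T$, and $P=\pmatrix{1&1\\1&1}$, a rank-$1$ positive semidefinite block, which makes $A$ invertible ($\det A=-1$):

```python
import numpy as np

# A = [[q, b^T], [b, P]] with q = 5, b = (2, 3)^T, and
# P = [[1, 1], [1, 1]] rank-1 positive semidefinite (sample values).
A = np.array([[5.0, 2.0, 3.0],
              [2.0, 1.0, 1.0],
              [3.0, 1.0, 1.0]])

print(np.linalg.det(A))        # ~ -1.0, so A is invertible
# The (1,1) entry of A^{-1} equals det(P)/det(A) = 0/(-1) = 0.
print(np.linalg.inv(A)[0, 0])  # ~ 0.0
```

Since $\alpha=e_1$ here, the vanishing $(1,1)$ entry of $A^{-1}$ is exactly $\alpha^TA^{-1}\alpha=0$, as claimed.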