Uniqueness of tangent to a conic section


Given the following set of equations in the unknown $X$: $$\begin{cases} \alpha^T X=0,\\ X^T AX=0, \end{cases}$$ where $\alpha,X\in\mathbb R^3$ are nonzero and $A$ is an invertible real symmetric matrix. I want to prove that the equation set has a “unique” solution (i.e. all solutions are collinear) if and only if $\alpha^T A^{-1}\alpha=0$.

This problem comes from an interesting fact: a line $ax+by+c=0$ is tangent to a conic curve $$ \pmatrix{x&y&1}A\pmatrix{x\\ y\\ 1}=0 $$ if and only if $\alpha=\left(a,b,c\right)^T$ satisfies $\alpha^T A^{-1}\alpha=0$. I’m trying to use linear algebra to prove it.
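This fact can be illustrated numerically (a minimal sketch, assuming NumPy; the unit circle and the two sample lines below are my own example, not from the question):

```python
import numpy as np

# Unit circle x^2 + y^2 - 1 = 0 in homogeneous form [x, y, 1] A [x, y, 1]^T = 0.
A = np.diag([1.0, 1.0, -1.0])

def tangency_value(alpha, A):
    """alpha^T A^{-1} alpha; zero iff the line alpha is tangent to the conic A."""
    return alpha @ np.linalg.solve(A, alpha)

tangent = np.array([1.0, 0.0, -1.0])  # the line x = 1, touching the circle at (1, 0)
secant = np.array([0.0, 1.0, 0.0])    # the line y = 0, cutting the circle at (+-1, 0)

print(tangency_value(tangent, A))  # 0.0 -> tangent
print(tangency_value(secant, A))   # 1.0 -> not tangent
```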

So far I have proved sufficiency, i.e. that $\alpha^T A^{-1}\alpha=0$ implies all the solutions are collinear, but when I started on the necessity, I found it more difficult.

Does anyone have any ideas?


On BEST ANSWER

By a change of basis, we may assume that $$ \alpha=\pmatrix{1\\ 0\\ 0}\ \text{ and }\ A=\pmatrix{q&b^T\\ b&P}, $$ where $q$ is a scalar and $P$ is $2\times2$.

If $\alpha^TX=X^TAX=0$ has a unique non-trivial solution up to scaling, then $v^TPv=0$ also has a unique non-trivial solution up to scaling. It is not hard to see that $v^TPv=0$ has no non-trivial solution when $P$ is definite, and two linearly independent solutions when $P$ is zero or indefinite. Therefore $P$ has to be a rank-$1$ semidefinite matrix. But then the $(1,1)$ cofactor of $A$, namely $\det P$, is zero, and in turn so is the $(1,1)$ entry of $A^{-1}$. Hence $\alpha^TA^{-1}\alpha=0$.
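As a sanity check of this argument (a numerical sketch, assuming NumPy; the concrete instance of $q$, $b$, $P$ is my own choice, not from the answer), take $q=0$, $b=(0,1)^T$, $P=\operatorname{diag}(1,0)$:

```python
import numpy as np

# alpha = e_1, and A partitioned as [[q, b^T], [b, P]].  Concrete instance
# (my own choice): q = 0, b = (0, 1)^T, P = diag(1, 0), which is rank-1 PSD.
alpha = np.array([1.0, 0.0, 0.0])
A = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

P = A[1:, 1:]
assert np.linalg.matrix_rank(P) == 1            # rank 1
assert np.all(np.linalg.eigvalsh(P) >= -1e-12)  # positive semidefinite

Ainv = np.linalg.inv(A)
print(Ainv[0, 0])            # 0.0: the (1,1) entry of A^{-1} vanishes
print(alpha @ Ainv @ alpha)  # 0.0: the tangency condition holds

# And indeed alpha^T X = X^T A X = 0 forces X = (0, y, z) with
# X^T A X = y^2 = 0, so every solution is a multiple of (0, 0, 1).
```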


In fact, the condition says that the system has a solution of multiplicity $2$. What follows is an elementary proof. We consider only points at a finite distance (i.e. affine points).

Let $D$ be the line $ax+by+c=0$; in particular $a\not= 0$ or $b\not= 0$. Suppose $b\not= 0$; then the equation of $D$ can be written $y=ux+v$, that is, $\alpha=[u,-1,v]^T$. We put $[b_1,b_2,b_3]^T=A[x,ux+v,1]^T$.

The intersection of $D$ and the conic is given by

$[x,ux+v,1]A[x,ux+v,1]^T=0$, that is (1) $xb_1+(ux+v)b_2+b_3=0$.

and has order $2$; taking the derivative with respect to $x$ (and using the symmetry of $A$), this can be written

$[1,u,0]A[x,ux+v,1]^T=0$, that is (2) $b_1+ub_2=0$.

From (1) and (2) we deduce (3) $vb_2+b_3=0$.

From (2) and (3), there is $\lambda$ s.t. $[b_1,b_2,b_3]^T=\lambda\alpha$.

Finally, since $X=A^{-1}[b_1,b_2,b_3]^T$, the last condition to satisfy is

$\alpha^TX=\alpha^T A^{-1}[b_1,b_2,b_3]^T=\lambda \alpha^TA^{-1}\alpha=0,$

and we are done.
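The computation above can be checked numerically on a concrete tangent line (a sketch, assuming NumPy; the unit circle and the line $y=x+\sqrt2$ are my own choice, not from the answer):

```python
import numpy as np

# Unit circle x^2 + y^2 - 1 = 0, i.e. A = diag(1, 1, -1), and the tangent
# line y = x + sqrt(2) (u = 1, v = sqrt(2)), touching at (-sqrt(2)/2, sqrt(2)/2).
A = np.diag([1.0, 1.0, -1.0])
u, v = 1.0, np.sqrt(2.0)
alpha = np.array([u, -1.0, v])

x0 = -np.sqrt(2.0) / 2.0             # x-coordinate of the tangency point
X = np.array([x0, u * x0 + v, 1.0])  # homogeneous coordinates of the point
b = A @ X                            # the vector [b_1, b_2, b_3]^T

print(X @ A @ X)        # ~0, equation (1): the point lies on the conic
print(b[0] + u * b[1])  # ~0, equation (2): the double-root condition
print(v * b[1] + b[2])  # ~0, equation (3): deduced from (1) and (2)

lam = b[1] / alpha[1]               # the scalar lambda with b = lam * alpha
print(np.allclose(b, lam * alpha))  # True: b is proportional to alpha
print(alpha @ np.linalg.solve(A, alpha))  # ~0: alpha^T A^{-1} alpha = 0
```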