Writing down the equations for a line through two points in projective space


Let $P$ and $Q$ be distinct points in $\mathbb{P}^n_K.$

I want to write down a system of homogeneous linear equations which cuts out the unique line through $P$ and $Q.$

Let $L_P$ and $L_Q$ be the distinct lines through the origin in $K^{n+1}$ corresponding to $P$ and $Q.$

So I want an $(n-1)\times(n+1)$ matrix $A$ with kernel exactly equal to the plane spanned by $L_P$ and $L_Q.$

Problem. I can't figure out how, in general, to write "an algorithm" for finding such a matrix.

Here's an example of what I mean...


Example. Let $P=(1:0:0:0)$ and $Q=(0:1:1:0)$ in $\mathbb{P}^3.$

Let $$A=\begin{bmatrix} a_1&a_2&a_3&a_4 \\ b_1&b_2&b_3&b_4 \end{bmatrix}$$

Requiring $A$ to vanish on the plane spanned by the lines $L_P$ and $L_Q$ yields linear equations: $$a_1=0, \\ b_1=0, \\ a_2+a_3=0, \\ b_2+b_3=0.$$ To ensure that this plane is exactly the kernel, the rows of $A$ must be linearly independent.

In this small case, this boils down to requiring that $$(a_2,-a_2,a_4)\neq \lambda (b_2,-b_2,b_4) \; \; \; \; \forall \lambda \in K.$$ Now I can just try plugging in some values to find some that work...

However, there still seems to be some redundancy in the choices; perhaps this is accounted for by the freedom to rescale each equation?
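To make the example concrete, here is a short sketch in Python that checks one particular choice of $A$ (the values $a_2=1,\ a_4=0,\ b_2=0,\ b_4=1$ are a hypothetical choice of mine, not from the question):

```python
# The example from the question: representatives of L_P and L_Q in K^4.
P = (1, 0, 0, 0)
Q = (0, 1, 1, 0)

# One choice satisfying a1 = 0, a2 + a3 = 0, b1 = 0, b2 + b3 = 0
# (hypothetical values: a2 = 1, a4 = 0 and b2 = 0, b4 = 1).
A = [(0, 1, -1, 0),
     (0, 0, 0, 1)]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Every row of A vanishes on both P and Q...
assert all(dot(row, pt) == 0 for row in A for pt in (P, Q))

# ...and some 2x2 minor of A is nonzero, so the two rows are
# independent and the kernel of A is exactly the plane
# spanned by L_P and L_Q.
assert any(A[0][i] * A[1][j] - A[0][j] * A[1][i] != 0
           for i in range(4) for j in range(i + 1, 4))
```

Any other valid choice of $A$ passes the same two checks, which is one way to see that the "redundancy" is just the freedom to rescale and recombine the two equations.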

Best Answer

An algorithm for this is the one you (I hope) already know for solving homogeneous systems of linear equations: Assemble the coefficients (here, the coordinates of the two points that define the line) into a matrix and find a basis for its null space. The usual way of doing the latter is to compute the row-reduced echelon form of the matrix via Gaussian elimination and then read the null space basis from it by the standard method: set each free variable to $1$ in turn and solve for the pivot variables. Since the coefficient matrix only has two rows, that’s not going to take much work at all.
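This procedure can be sketched in a few dozen lines of plain Python. The function name `nullspace_basis` is my own, not a library call; exact rational arithmetic via `fractions.Fraction` stands in for working over $\mathbb{Q}$:

```python
from fractions import Fraction

def nullspace_basis(rows):
    """Row-reduce the matrix over Q, then read off a null space basis
    by setting each free variable to 1 and solving for the pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(m), len(m[0])
    pivots = []
    r = 0
    for c in range(ncols):
        # Find a pivot in column c at or below row r.
        p = next((i for i in range(r, nrows) if m[i][c] != 0), None)
        if p is None:
            continue  # no pivot here: c will be a free column
        m[r], m[p] = m[p], m[r]
        m[r] = [x / m[r][c] for x in m[r]]          # scale pivot to 1
        for i in range(nrows):
            if i != r and m[i][c] != 0:             # clear the column
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
        if r == nrows:
            break
    free = [c for c in range(ncols) if c not in pivots]
    basis = []
    for f in free:
        v = [Fraction(0)] * ncols
        v[f] = Fraction(1)
        # Each pivot variable is read off from the RREF row.
        for row, c in zip(m, pivots):
            v[c] = -row[f]
        basis.append(v)
    return basis

# The question's example: the line through (1:0:0:0) and (0:1:1:0)
# is cut out by the planes x2 = x3 and x4 = 0.
basis = nullspace_basis([(1, 0, 0, 0), (0, 1, 1, 0)])
assert basis == [[0, -1, 1, 0], [0, 0, 0, 1]]
```

The same function handles the nontrivial example below unchanged, since `Fraction` keeps the $\frac79$-type entries exact.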

Taking your example, the coefficient matrix is $$\begin{bmatrix}1&0&0&0\\0&1&1&0\end{bmatrix},$$ which is already in RREF. We then read the null space basis vectors $(0,-1,1,0)^T$ and $(0,0,0,1)^T$ from it, so the line is the meet of those two planes. That is, a system of equations that defines this line is $x_2=x_3$ and $x_4=0$. To verify this solution, take the join of the two points $\lambda P+\mu Q = (\lambda:\mu:\mu:0)$ and plug it into the system of equations. It’s obvious by inspection that both are satisfied by every point on the line.

For a nontrivial example, let’s take $P=(1:2:3:0)$ and $Q=(4:-1:2:3)$. The RREF of the coefficient matrix is $$\begin{bmatrix}1&0&\frac79&\frac23\\0&1&\frac{10}9&-\frac13\end{bmatrix},$$ from which we read the null space basis $\left(\frac79,\frac{10}9,-1,0\right)$, $\left(\frac23,-\frac13,0,-1\right)$, so this line is the meet of the planes $(7:10:-9:0)$ and $(2:-1:0:-3)$.
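A quick spot-check of this example: every point $\lambda P + \mu Q$ on the line should satisfy the two plane equations $7x_1+10x_2-9x_3=0$ and $2x_1-x_2-3x_4=0$ read off above. A few sample values of $(\lambda,\mu)$:

```python
# Spot-check: lambda*P + mu*Q lies on both planes for any lambda, mu.
P = (1, 2, 3, 0)
Q = (4, -1, 2, 3)
for lam, mu in [(1, 0), (0, 1), (1, 1), (2, -3), (5, 7)]:
    x = [lam * p + mu * q for p, q in zip(P, Q)]
    assert 7 * x[0] + 10 * x[1] - 9 * x[2] == 0   # plane (7:10:-9:0)
    assert 2 * x[0] - x[1] - 3 * x[3] == 0        # plane (2:-1:0:-3)
```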

This is an instance of a more general principle: for any flat, there are two basic dual representations of it as a matrix, which correspond to defining the flat as the join of some set of points or the meet of some set of hyperplanes. The row spaces of these matrices are each other’s orthogonal complements.
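For the line above, this duality amounts to the statement that the point matrix $M$ and the hyperplane matrix $N$ satisfy $MN^T = 0$, and their ranks sum to $n+1$. A minimal check with the data from the second example:

```python
# Two dual representations of the same line in P^3:
M = [(1, 2, 3, 0), (4, -1, 2, 3)]     # join of two points
N = [(7, 10, -9, 0), (2, -1, 0, -3)]  # meet of two planes

# Row spaces are orthogonal complements, so M @ N^T = 0
# (each has rank 2, and 2 + 2 = 4 = n + 1).
product = [[sum(a * b for a, b in zip(mr, nr)) for nr in N] for mr in M]
assert product == [[0, 0], [0, 0]]
```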