Let $\mathbb P^n$ denote the projective $n$-space over an algebraically closed field $k$, i.e. $\mathbb P^n$ is given by $(\mathbb A^{n+1}\setminus \{0\})/ \sim$ where $\mathbb A^{n+1}$ is the affine $(n+1)$-space and $\sim$ is the equivalence relation identifying points which are scalar multiples of one another; that is, for two points $(a_0, \cdots , a_n)$ and $(b_0, \cdots , b_n)$ in $\mathbb A^{n+1}\setminus\{0\}$, $$(a_0, \cdots , a_n) \sim (b_0, \cdots , b_n) \iff \exists \hspace{1mm} \lambda \in k^\times \text{ s.t. } b_j = \lambda a_j \text{ for all } 0 \leq j \leq n$$ By a hyperplane in $\mathbb P^n$, I shall mean the zero set of some homogeneous linear polynomial $f \in k[x_0, \cdots , x_n]$, that is, some polynomial of the form $f(x_0, \cdots , x_n) := \sum_{j=0}^n a_j x_j$ where $(a_0, \cdots , a_n) \neq (0, \cdots , 0)$ (the coefficient tuple determines the hyperplane only up to scaling, so it can be regarded as a point of $\mathbb P^n$).
I have seen the following result get used in a few contexts before, and although I can see intuitively why it must be true, I have been unable to find a rigorous argument justifying the same:
Fact(?) Let $H$ be a hyperplane and $P$ any point in $\mathbb P^n \setminus H$. Then there exists a linear transformation $A \in \text{GL}_{n+1}(k)$ such that $A(H)$ is the hyperplane $\{(x_0, \cdots , x_n) : x_0=0\}$ and $A(P) = (1, 0, \cdots , 0)$.
I am looking for a complete and concise proof of this result, as clean as possible. I believe that one possible argument could rest on the following observations:
- $H$ is uniquely determined by any $n$ linearly independent points on it. So we now pick $n$ such points $P_1, \cdots , P_n$ on $H$.
- There exists a linear transformation sending $P$ to $(1, 0, \cdots, 0)$ and $P_j$ to $(0, \cdots , 0 , 1, 0, \cdots, 0) \in \mathbb P^n$ ($1$ in the $j$-th slot; here the $n+1$ slots are labelled the $0$-th, $1$-st, ..., $n$-th slots) for each $1 \leq j \leq n$.
I have however been unable to make these clean and rigorous (I keep getting involved with too many linear equations) and am starting to doubt the accuracy of my intuition. I would really appreciate a complete argument for the above "Fact(?)" or a reference containing the same and if possible, suggestions on how to make my idea work.
Edit (Some Progress): Thanks to Roland's comment, I think I have made some progress:
Let $H$ be given by the equation $\sum_{j=0}^n a_j x_j = 0$. Then in $\mathbb A^{n+1}$, $H$ is cut out by the same equation (nevertheless I will call it $H_0$ when viewed as a subset of $\mathbb A^{n+1}$) while $P := (p_0, \cdots , p_n)$ becomes the line $L_0 := \{(p_0 t, \cdots , p_n t) : t \in k\}$. I should first show that there is a matrix $A \in \text{GL}_{n+1}(k)$ such that $A(H_0) = H_1$ and $A(L_0)=L_1$, where $H_1 := \{(0, x_1, \cdots , x_n) : x_j \in k\} \subset \mathbb A^{n+1}$ and $L_1$ is the line $\{(t, 0, \cdots , 0) : t \in k\} \subset \mathbb A^{n+1}$.
So now I can pick $n$ linearly independent points $A_j \in H_0$ ($1 \leq j \leq n$), which is possible since $H_0$ is an $n$-dimensional subspace of $\mathbb A^{n+1}$, and I get a linear transformation $A \in \text{GL}_{n+1}(k)$ which sends $A_j$ to $(0, \cdots , 0 , 1, 0, \cdots , 0)$ (with $1$ in the $j$-th slot) for each $1 \leq j \leq n$ (extend $A_1, \cdots , A_n$ to a basis of $\mathbb A^{n+1}$ to define $A$ everywhere). Thus $A$ sends $H_0$ to $H_1$. I still have to send $A(L_0)$ to $L_1$, so I need a linear transformation $T \in \text{GL}_{n+1}(k)$ which sends $A(L_0)$ (which is also a line through the origin, not contained in $H_1$ since $P \notin H$) to $L_1$ and leaves $H_1$ invariant (as a set).
Finally, we let $T \in \text{GL}_{n+1}(k)$ be the linear transformation that sends $A(p_0, \cdots , p_n) \in \mathbb A^{n+1}$ to $(1, 0, \cdots , 0)$ and fixes some basis of $H_1$ pointwise; $T$ is invertible because $A(p_0, \cdots , p_n) \notin H_1$, so it together with that basis of $H_1$ forms a basis of $\mathbb A^{n+1}$.
Upon getting this last linear transformation $T$, we note that $TA \in \text{GL}_{n+1}(k)$ sends $H_0$ to $H_1$ and $L_0$ to $L_1$ in $\mathbb A^{n+1}$. Therefore $TA$ should also do the required job, namely, send $H$ to $\{(0, x_1, \cdots , x_n)\} \subset \mathbb P^n$ and $P$ to $(1, 0, \cdots , 0)$, thus completing the proof and making "Fact(?)" a fact.
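As a numerical sanity check of this construction (a minimal sketch: the choices of $a$, $p$ and the basis of $H_0$ below are illustrative, and the two steps $T$ and $A$ are collapsed into one matrix, namely the inverse of the matrix whose columns are $p$ followed by a basis of $H_0$):

```python
import numpy as np

# Hyperplane H: a @ x = 0 in P^2, and a point P with a @ p != 0.
# (These particular values are just an illustrative example.)
a = np.array([1.0, 1.0, 1.0])      # H: x0 + x1 + x2 = 0
p = np.array([1.0, 0.0, 0.0])      # a @ p = 1 != 0, so P is not on H

# A basis of H_0 = ker(a): two independent solutions of a @ v = 0.
h1 = np.array([1.0, -1.0, 0.0])
h2 = np.array([0.0, 1.0, -1.0])

# M has columns (p, h1, h2); it is invertible precisely because p lies
# outside H_0.  Its inverse sends p -> e0, h1 -> e1, h2 -> e2.
M = np.column_stack([p, h1, h2])
A = np.linalg.inv(M)

print(A @ p)             # e0 (up to rounding): P goes to (1, 0, 0)
print(A @ h1, A @ h2)    # first coordinate 0: H goes to {x0 = 0}
```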
My only follow-up question: Is this argument correct, or are there any gaps?
A small note: $GL_{n+1}(k)$ acts differently on hyperplanes (linear forms) than on points. If $H$ is a hyperplane with equation $ax=0$, where $a$ is a row vector of coefficients and $x$ a column vector, then the equation of $AH$ is $(aA^{-1})x=0$.
Now, your problem is the following: given nonzero vectors $a$ (row) and $x$ (column) with $ax \neq 0$, find an invertible matrix $A$ such that $aA^{-1}=(1,0,\ldots,0)$ and $Ax=(r,0,\ldots,0)^T$ with $r \neq 0$.
Find a basis $(a_2,\ldots,a_{n+1})$ of the $n$-dimensional space of row vectors orthogonal to $x$ (i.e. with $a_ix=0$). Take $A_1$ to be the matrix whose rows are $(a,a_2,\ldots,a_{n+1})$; it is invertible because $ax \neq 0$ while $a_ix=0$, so $a$ does not lie in the span of $a_2,\ldots,a_{n+1}$. Then by construction $aA_1^{-1}=(1,0,\ldots,0)$, and $x_1=A_1x$ has zero entries at indices $2 \leq i \leq n+1$ and nonzero first entry $ax$. So we are done.
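This recipe translates directly into a few lines of NumPy (a sketch: the function name and the particular $a$, $x$ are illustrative; the orthogonal complement of $x$ is computed here via an SVD, one of several ways to get such a basis):

```python
import numpy as np

def normalizing_matrix(a, x):
    """Rows of A1: the linear form a, then a basis of {v : v @ x = 0}.
    Then a @ inv(A1) = (1, 0, ..., 0) and A1 @ x = (a @ x, 0, ..., 0)."""
    # SVD of the 1 x (n+1) matrix with single row x: the right-singular
    # vectors after the first span the row vectors orthogonal to x.
    _, _, vt = np.linalg.svd(x.reshape(1, -1))
    orth = vt[1:]                   # n rows, each orthogonal to x
    return np.vstack([a, orth])     # invertible since a @ x != 0

# Illustrative example in P^2: here a @ x = 2 != 0, so P is off H.
a = np.array([1.0, 1.0, 0.0])
x = np.array([1.0, 1.0, 1.0])
A1 = normalizing_matrix(a, x)

print(a @ np.linalg.inv(A1))        # ~ (1, 0, 0): A1(H) is {x0 = 0}
print(A1 @ x)                       # ~ (2, 0, 0) ~ (1, 0, 0) in P^2
```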