Incidence correspondence of smooth hypersurfaces of degree $5$ in $\mathbb{P}^4$


I'm dealing with Exercise 11.11(a) in Gathmann's 2021 Algebraic Geometry notes, which reads:

Exercise 11.11 (let $K=\mathbb{C}$). As in Exercise 10.23(b), let $U\subset\mathbb{P}^{\binom{4+5}{4}-1}=\mathbb{P}^{125}$ be the set of all smooth ($3$-dimensional) hypersurfaces of degree $5$ in $\mathbb{P}^4$.

(a) Using the Jacobi criterion, show that the incidence correspondence \begin{equation*} \{(X,L)\in U\times G(2,5)\mid L \text{ is a line contained in }X \} \end{equation*} is smooth of dimension $125$, i.e., of the same dimension as $U$.

Following the same lines as the proof of Lemma 11.4 in the notes, I get the following (incomplete) argument:

Let $(X,L)$ be a point in the incidence correspondence. By a linear change of coordinates we may assume that $L=\text{Lin}(e_1,e_2)$, so locally around $L\in G(2,5)$ in the Zariski topology we can use the affine coordinates on the Grassmannian as in Construction 8.18, namely $a_2,a_3,a_4,b_2,b_3,b_4\in \mathbb{C}$ corresponding to the line in $\mathbb{P}^4$ given as the row span of the matrix \begin{equation*} \left(\begin{matrix} 1 &0 &a_2 &a_3 &a_4\\ 0 &1 &b_2 &b_3 &b_4 \end{matrix}\right), \end{equation*} with the point $ (a_2,a_3,a_4,b_2,b_3,b_4) =(0,0,0,0,0,0)$ corresponding to $L$. On the space $U$ of smooth hypersurfaces of degree $5$ we use the coordinates $(c_\alpha)_{\alpha}$ as in Construction 11.3, i.e., $(c_\alpha)_{\alpha}$ corresponds to the hypersurface $V_p(f_c)$, where $f_c:= \sum_{\alpha} c_\alpha x^\alpha$ and $\alpha$ runs over all quintuples of non-negative integers $(\alpha_0,\alpha_1,\alpha_2,\alpha_3,\alpha_4)$ with $\sum_i \alpha_i = 5$. Putting this together, in the coordinates $(c,a,b)=((c_\alpha),a_2,a_3,a_4,b_2,b_3,b_4)$ on $U\times G(2,5)$ the incidence correspondence is given by \begin{equation*} \begin{aligned} (c,a,b) \text{ is in the incidence correspondence }&\Leftrightarrow f_c(s(1,0,a_2,a_3,a_4)+t(0,1,b_2,b_3,b_4))=0 \text{ for all }s,t\\ &\Leftrightarrow \sum_{\alpha} c_\alpha s^{\alpha_0} t^{\alpha_1}(sa_2+tb_2)^{\alpha_2}(sa_3+tb_3)^{\alpha_3}(sa_4+tb_4)^{\alpha_4}=0 \text{ for all }s,t\\ &\Leftrightarrow\colon \sum_{i}s^it^{5-i}F_i(c,a,b)=0 \text{ for all }s,t\\ &\Leftrightarrow F_i(c,a,b)=0\text{ for }0\leq i\leq 5. \end{aligned} \end{equation*} Since $\dim G(2,5)= 2(5-2)=6$ by Corollary 8.19, it suffices by Corollary 10.14(b) to show that the Jacobian matrix $J:= \frac{\partial(F_0,F_1,F_2,F_3,F_4,F_5)}{\partial(a_2,a_3,a_4,b_2,b_3,b_4)}$ is invertible at $a=b=0$.
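As a sanity check, the expansion defining the $F_i$ can be carried out symbolically. The quintic below is a hypothetical example of mine, chosen only so that it contains the line $\{x_2=x_3=x_4=0\}$ (sympy is assumed):

```python
import sympy as sp

s, t = sp.symbols('s t')
a2, a3, a4, b2, b3, b4 = sp.symbols('a2 a3 a4 b2 b3 b4')
x0, x1, x2, x3, x4 = sp.symbols('x0 x1 x2 x3 x4')

# Hypothetical quintic containing L = {x2 = x3 = x4 = 0}: every monomial
# involves x2, x3 or x4, so f vanishes identically on the line.
f = x0**4*x2 + x1**4*x3 + x0*x1**3*x4 + x2**5

# Substitute the parametrized point s*(1,0,a2,a3,a4) + t*(0,1,b2,b3,b4).
g = sp.expand(f.subs({x0: s, x1: t, x2: s*a2 + t*b2,
                      x3: s*a3 + t*b3, x4: s*a4 + t*b4}))

# g is homogeneous of degree 5 in (s, t); read off F_i as the coefficient
# of s^i t^(5-i).  (X, L) lies in the correspondence iff F_0 = ... = F_5 = 0.
F = [g.coeff(s, i).coeff(t, 5 - i) for i in range(6)]
for i, Fi in enumerate(F):
    print(f"F_{i} =", Fi)
```

Since every $F_i$ vanishes at $a=b=0$, the point $(X,L)$ indeed lies in the correspondence in these coordinates.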

To compute $J$, note that \begin{equation*} \frac{\partial}{\partial a_2}\left(\sum_i s^i t^{5-i} F_i\right)\Bigg|_{a=b=0} = \frac{\partial}{\partial a_2} f_c(s,t,sa_2+tb_2,sa_3+tb_3,sa_4+tb_4)\bigg|_{a=b=0} = s\frac{\partial f_c}{\partial x_2}(s,t,0,0,0). \end{equation*} The $(s,t)$-coefficients of this polynomial are the first column in the matrix $J$. Similarly, the other columns are $s\frac{\partial f_c}{\partial x_3}(s,t,0,0,0)$, $s\frac{\partial f_c}{\partial x_4}(s,t,0,0,0)$, $t\frac{\partial f_c}{\partial x_2}(s,t,0,0,0)$, $t\frac{\partial f_c}{\partial x_3}(s,t,0,0,0)$ and $t\frac{\partial f_c}{\partial x_4}(s,t,0,0,0)$. Hence, if the matrix $J$ were not invertible, there would be a relation \begin{equation*} (\lambda_2 s+\mu_2 t)\frac{\partial f_c}{\partial x_2}(s,t,0,0,0)+(\lambda_3 s+\mu_3t)\frac{\partial f_c}{\partial x_3}(s,t,0,0,0) +(\lambda_4s+\mu_4t)\frac{\partial f_c}{\partial x_4}(s,t,0,0,0)=0 \end{equation*} identically in $s,t$ with $(\lambda_2,\lambda_3,\lambda_4,\mu_2,\mu_3,\mu_4)\in \mathbb{C}^6\setminus\{0\}$.
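The chain-rule computation behind these six columns can be verified symbolically for a fully generic quintic (a sketch assuming sympy; the coefficient symbols `c_...` are my own naming for the $c_\alpha$):

```python
from itertools import product

import sympy as sp

s, t = sp.symbols('s t')
ab = sp.symbols('a2 a3 a4 b2 b3 b4')
a2, a3, a4, b2, b3, b4 = ab
X = sp.symbols('x0 x1 x2 x3 x4')

# Generic quintic: one coefficient c_alpha per degree-5 monomial in x0..x4.
alphas = [e for e in product(range(6), repeat=5) if sum(e) == 5]
c = {e: sp.Symbol('c_' + ''.join(map(str, e))) for e in alphas}
f = sum(c[e] * sp.prod(x**k for x, k in zip(X, e)) for e in alphas)

# Restrict f to the line parametrized by (s, t).
g = f.subs({X[0]: s, X[1]: t, X[2]: s*a2 + t*b2,
            X[3]: s*a3 + t*b3, X[4]: s*a4 + t*b4})
zero = {v: 0 for v in ab}

# At a = b = 0, d/da_j gives s * (df/dx_j)(s,t,0,0,0) and
# d/db_j gives t * (df/dx_j)(s,t,0,0,0), as claimed.
on_line = {X[0]: s, X[1]: t, X[2]: 0, X[3]: 0, X[4]: 0}
for j, (aj, bj) in enumerate(zip((a2, a3, a4), (b2, b3, b4)), start=2):
    dj = sp.diff(f, X[j]).subs(on_line)
    assert sp.expand(sp.diff(g, aj).subs(zero) - s*dj) == 0
    assert sp.expand(sp.diff(g, bj).subs(zero) - t*dj) == 0
print("all six columns match")
```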

However, the rest of the argument of Lemma 11.4 does not seem to apply here. In Lemma 11.4 the matrix $J$ is $4\times 4$, and the remaining argument reads:

Hence, if the matrix $J$ was not invertible, there would be a relation \begin{equation*} (\lambda_2 s+\mu_2 t)\frac{\partial f_c}{\partial x_2}(s,t,0,0)+(\lambda_3 s+\mu_3t)\frac{\partial f_c}{\partial x_3}(s,t,0,0) =0 \end{equation*} identically in $s,t$ with $(\lambda_2,\lambda_3,\mu_2,\mu_3)\in \mathbb{C}^4\setminus\{0\}$. As homogeneous polynomials in two variables always decompose into linear factors, this means that $\frac{\partial f_c}{\partial x_2}(s,t,0,0)$ and $\frac{\partial f_c}{\partial x_3}(s,t,0,0)$ must have a common linear factor, i.e. that there is a point $p=(p_0,p_1,0,0)\in L$ with $\frac{\partial f_c}{\partial x_2}(p)=\frac{\partial f_c}{\partial x_3}(p)=0$.

and then a contradiction follows from $p\in L\subset X$ together with the smoothness of $X$.

It seems to me that the conclusion that "$\frac{\partial f_c}{\partial x_2}(s,t,0,0)$ and $\frac{\partial f_c}{\partial x_3}(s,t,0,0)$ must have a common linear factor" follows from the fact that the polynomial ring is a UFD, using the equality $$(\lambda_2 s+\mu_2 t)\frac{\partial f_c}{\partial x_2}(s,t,0,0)=-(\lambda_3 s+\mu_3t)\frac{\partial f_c}{\partial x_3}(s,t,0,0).$$ But this does not carry over to my case, where there are three terms $(\lambda_i s+\mu_i t)\frac{\partial f_c}{\partial x_i}(s,t,0,0,0)$. Moreover, the degenerate cases are harmless with two terms: if for example $\lambda_2=\mu_2=0$, then $(\lambda_3 s+\mu_3t)\frac{\partial f_c}{\partial x_3}(s,t,0,0)=0$, hence $\frac{\partial f_c}{\partial x_3}(s,t,0,0)=0$, and the point $p\in L$ can be taken as any point with $ \frac{\partial f_c}{\partial x_2}(p)=0$. With three terms as in my case, however, there seems to be no such relation between the zero locus of $\frac{\partial f_c}{\partial x_2}(s,t,0,0,0)$ and those of $\frac{\partial f_c}{\partial x_3}(s,t,0,0,0)$ and $\frac{\partial f_c}{\partial x_4}(s,t,0,0,0)$.
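To make the worry concrete, here is a quick symbolic check (a toy example of my own, not arising from an actual quintic): three homogeneous quartics with no common projective zero that nevertheless satisfy a three-term relation of exactly this shape, so the common-linear-factor argument genuinely breaks down once three terms are involved.

```python
import sympy as sp

s, t = sp.symbols('s t')

# Three homogeneous quartics with no common projective zero:
# g2 vanishes only at t = 0, g3 only at s = 0.
g2 = t**4
g3 = s**4
g4 = -s*t*(s**2 - s*t + t**2)

# Yet a relation (l2*s+m2*t)*g2 + (l3*s+m3*t)*g3 + (l4*s+m4*t)*g4 = 0 holds
# with (l2, m2, l3, m3, l4, m4) = (1, 0, 0, 1, 1, 1), because
# (s + t)*(s**2 - s*t + t**2) = s**3 + t**3.
rel = s*g2 + t*g3 + (s + t)*g4
assert sp.expand(rel) == 0
print("relation holds; gcd(g2, g3, g4) =", sp.gcd(sp.gcd(g2, g3), g4))
```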

Can these problems be resolved with some alternative arguments? If not, how can I show the desired invertibility of my $J$ then?

Thanks in advance for any help.

Accepted answer:

I asked my professor, and it turns out that my previous $J$ is in general not invertible. Indeed, $J$ is only part of the Jacobian matrix of the incidence correspondence, and to show that the full Jacobian has the required rank, the partial derivatives with respect to the $c_\alpha$'s must be taken into consideration.

Following my previous argument, it then suffices to show that the Jacobian matrix $J:=\frac{\partial(F_0,F_1,F_2,F_3,F_4,F_5)}{\partial(c,a,b)}$ has full rank $6$.

To compute the rank, note that \begin{equation*} \frac{\partial}{\partial c_\alpha}\left(\sum_i s^i t^{5-i} F_i\right)\Bigg|_{a=b=0} = \frac{\partial}{\partial c_\alpha} f_c(s,t,sa_2+tb_2,sa_3+tb_3,sa_4+tb_4)\bigg|_{a=b=0} = \begin{cases} s^{\alpha_0}t^{\alpha_1} &\text{if }\alpha_0+\alpha_1=5,\\ 0 &\text{otherwise.} \end{cases} \end{equation*} The $(s,t)$-coefficients of these polynomials, one for each $\alpha$, are the columns of the matrix $J$. Letting $\alpha$ vary over $\{(i,5-i,0,0,0)\mid i=0,1,2,3,4,5\}$, we find a $6\times 6$ identity submatrix in $J$, so $J$ has full rank $6$, as desired.
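This can again be checked symbolically with a generic quintic (a sketch assuming sympy; the symbols `c_...` are my naming for the $c_\alpha$): the six columns of $J$ corresponding to the coefficients $c_{(i,5-i,0,0,0)}$ do form a $6\times 6$ identity matrix at $a=b=0$.

```python
from itertools import product

import sympy as sp

s, t = sp.symbols('s t')
ab = sp.symbols('a2 a3 a4 b2 b3 b4')
a2, a3, a4, b2, b3, b4 = ab
X = sp.symbols('x0 x1 x2 x3 x4')

# Generic quintic with one coefficient c_alpha per degree-5 monomial.
alphas = [e for e in product(range(6), repeat=5) if sum(e) == 5]
c = {e: sp.Symbol('c_' + ''.join(map(str, e))) for e in alphas}
f = sum(c[e] * sp.prod(x**k for x, k in zip(X, e)) for e in alphas)

g = f.subs({X[0]: s, X[1]: t, X[2]: s*a2 + t*b2,
            X[3]: s*a3 + t*b3, X[4]: s*a4 + t*b4})
zero = {v: 0 for v in ab}
g0 = sp.expand(g.subs(zero))  # only monomials with alpha_0 + alpha_1 = 5 survive

# d/dc_alpha at a = b = 0 is s^a0 * t^a1 if a0 + a1 = 5, and 0 otherwise.
for e in alphas:
    d = sp.diff(g0, c[e])
    expected = s**e[0] * t**e[1] if e[0] + e[1] == 5 else sp.Integer(0)
    assert sp.expand(d - expected) == 0

# The columns for alpha = (i, 5-i, 0, 0, 0) give a 6x6 identity submatrix:
# row i of J is the coefficient of s^i t^(5-i).
special = [(i, 5 - i, 0, 0, 0) for i in range(6)]
J6 = sp.Matrix(6, 6, lambda i, j:
               sp.diff(g0, c[special[j]]).coeff(s, i).coeff(t, 5 - i))
print(J6 == sp.eye(6))
```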