Isotropic Subspace Implying the Existence of a Linear Equation Issue


Given a quadratic form in $n = 2 \nu + 1$ complex dimensions written as $$F = (x^0)^2 + x^1 x^{1'} + \dots + x^{\nu} x^{\nu'},$$ a vector $x = (x^0,x^1,\dots,x^{\nu},x^{1'},\dots,x^{\nu'})$, with $x^0,x^i,x^{i'} \in \mathbb{C}$, $i = 1,\dots,\nu$, is isotropic if it satisfies $$F = 0.$$ Given this setup, I would like to understand the following passage here (from this book), which I reproduce in the quote below:

We have seen in Chapter I (Section 10) that any isotropic subspace (i.e., a subspace in which all the vectors are isotropic) has dimension at most $\nu$. If the equations defining an isotropic subspace do not include an equation connecting $x^0,x^1,...,x^{\nu}$ it would be possible to express the $x^{i'}$ components of each vector in the subspace as the same linear combinations of the $x^0,x^1,...,x^{\nu}$ components which are arbitrary: such vectors do not satisfy $F = 0$. We shall establish the equations of an isotropic $\nu$-plane assuming the general case where there is no linear relation between $x^1,x^2,...,x^{\nu}$.

I believe the first sentence can be understood as follows: consider a $p$-dimensional isotropic subspace generated by $p$ basis vectors $\mathbf{e}_1,\dots,\mathbf{e}_p$. The space orthogonal to this subspace has dimension $n-p$; however, since each $\mathbf{e}_i$, $i=1,\dots,p$, is orthogonal to itself (and to the other $\mathbf{e}_j$), the $\mathbf{e}_i$ themselves lie among those $n-p$ dimensions, so only $n-2p$ basis vectors of the total space remain. Clearly we must have $n - 2p \geq 0$, which means $2p \leq 2 \nu + 1$, so that $p \leq \nu$.
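To make the bound $p \leq \nu$ concrete, here is a minimal sketch for $\nu = 2$ ($n = 5$); the coordinate ordering $(x^0, x^1, x^2, x^{1'}, x^{2'})$ and the particular subspace below are my own illustrative choices, not taken from the book:

```python
# Sketch for nu = 2 (n = 5), coordinates ordered (x0, x1, x2, x1', x2').
# The quadratic form is F(x) = (x0)^2 + x1*x1' + x2*x2'.
def F(x):
    return x[0]**2 + x[1]*x[3] + x[2]*x[4]

# A nu-dimensional isotropic subspace: all vectors with x0 = x1 = x2 = 0.
e1p, e2p = (0, 0, 0, 1, 0), (0, 0, 0, 0, 1)
for a in range(-2, 3):
    for b in range(-2, 3):
        v = tuple(a*u + b*w for u, w in zip(e1p, e2p))
        assert F(v) == 0   # every vector in the plane is isotropic

# Extending to dimension nu + 1 = 3 fails: adding e1 = (0, 1, 0, 0, 0)
# produces the non-isotropic vector e1 + e1'.
v = (0, 1, 0, 1, 0)
print(F(v))  # 1, so the enlarged subspace is not isotropic
```

This only illustrates one maximal isotropic plane, of course; the counting argument above is what rules out dimension $\nu + 1$ in general.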

The remaining sentences are trying to justify the fact that a relation of the form $$\eta_0 := \xi_0 x^0 + \xi_1 x^1 + ... + \xi_{\nu} x^{\nu} = 0$$ should exist (where the $\xi_0,\xi_i$, $i=1,...,\nu$, are complex coefficients). The abstract existence of this relation is what I am mainly trying to understand from this passage.

In the case of $n = 3$ this can be inferred directly: taking $F = (x^0)^2 + x^1 x^{1'} = 0$ and writing $x^0 = \sqrt{-x^1} \sqrt{x^{1'}} = \xi_0 \xi_1$, so that $\xi_0 x^0 = - \xi_1 x^1$, implies $\eta_0 = \xi_0 x^0 + \xi_1 x^1 = 0$. However, I have no idea what he's trying to say about what would happen if we assumed this wasn't possible, or what it would imply about $x^{1'}$ being expressible as 'the same linear combination' of $x^0,x^1$ (which are arbitrary and so do not satisfy $F=0$...?). Hence I cannot see how to directly infer the existence of $\eta_0 = \xi_0 x^0 + \xi_1 x^1 = 0$ without any computations (so that the argument easily generalizes to higher dimensions).

A potentially ridiculous idea is to re-write $F$ as $$F = (x^0,x^1) \cdot (x^0,x^{1'}) = 0,$$ which implies (?) the existence of other vectors $(\xi_0,\xi_1)$ orthogonal to $(x^0,x^1)$, so that $$\eta_0 = (x^0,x^1) \cdot (\xi_0,\xi_1) = \xi_0 x^0 + \xi_1 x^1 = 0$$ holds. It's doubtful he means something like this, but it's worth determining how valid it is. Indeed, from it we find $$F = \frac{1}{\xi_0}[\xi_0 x^0 x^0 + \xi_0 x^1 x^{1'}] = \frac{x^1}{\xi_0}[-\xi_1 x^0 + \xi_0 x^{1'}] = \frac{x^1}{\xi_0} (x^{1'},-x^0) \cdot (\xi_0,\xi_1) := \frac{x^1}{\xi_0} \eta_1 = 0,$$ which is important later on in this discussion.
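For what it's worth, the algebra in the last display can be checked symbolically. Writing $\eta_1 := \xi_0 x^{1'} - \xi_1 x^0$, the identity $\xi_0 F = x^0 \eta_0 + x^1 \eta_1$ (my own rearrangement of the computation above) shows that $\eta_0 = 0$ forces $F = \frac{x^1}{\xi_0}\eta_1$. A quick sympy sketch:

```python
# Symbolic check of xi0*F = x0*eta0 + x1*eta1 for the n = 3 case, where
# eta0 = xi0*x0 + xi1*x1 and eta1 = xi0*x1' - xi1*x0. When eta0 = 0,
# this identity reduces to F = (x1/xi0)*eta1, as in the display above.
import sympy as sp

x0, x1, x1p, xi0, xi1 = sp.symbols("x0 x1 x1p xi0 xi1")
F = x0**2 + x1*x1p
eta0 = xi0*x0 + xi1*x1
eta1 = xi0*x1p - xi1*x0
assert sp.expand(xi0*F - (x0*eta0 + x1*eta1)) == 0
print("identity holds")
```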

Any idea what's going on?


On BEST ANSWER

I think this is basically about Gaussian elimination. Consider a subspace $V$ of $\mathbb C^{2\nu+1}$ with $\dim(V) \leq \nu$ (e.g. any isotropic subspace). Then $V$ can be defined by a system $(\Sigma)$ of $m \geq \nu + 1$ independent linear equations: $$(\Sigma) : \quad \forall\, 1 \leq i \leq m, \qquad a_{i,0}x^0 + \sum_{j=1}^{\nu} (a_{i,j}x^j + a'_{i,j} x^{j'}) = 0.$$

Since we have at least $\nu+1$ independent equations, we may combine them by Gaussian elimination into one equation from which all $\nu$ variables $x^{1'}, \ldots, x^{\nu'}$ have been eliminated. This equation gives you a nontrivial relation of the form $\eta_0$.


Edit (on request of the comment): In the quote, the author intends to prove that the vectors of any isotropic subspace satisfy a nontrivial linear equation in the coordinates $x^0,\ldots,x^{\nu}$. They prove it by contraposition: if the vectors of the subspace did not satisfy any such equation, then the subspace wouldn't be isotropic. I personally find their justification a little difficult to grasp, so I suggest proving the statement directly, without contraposition.

On the other hand, for $0 \leq k \leq n$, any vector subspace $V$ of $\mathbb{C}^n$ of dimension $k$ can be determined by $n-k$ independent linear equations in the coordinates of the vectors. Equivalently, in a more abstract way, any $k$-dimensional subspace of a vector space of dimension $n$ is the intersection of $n-k$ hyperplanes. Thus, in the context of the question above, since $V$ is assumed to be isotropic, it has dimension $k \leq \nu$, and hence it is determined by $m := (2\nu+1) - k \geq \nu + 1$ independent linear equations.

I claim that from here on, the isotropy of $V$ is not needed anymore. In fact, the vectors inside any vector subspace of $\mathbb{C}^{2\nu+1}$ of dimension $\leq \nu$ must satisfy a nontrivial equation in the coordinates $x^0,\ldots,x^{\nu}$ only. To prove this, we use Gaussian elimination. Let's go back to the system of equations $(\Sigma)$ as above. We want to form linear combinations of the equations $(\Sigma_i)$ for $1\leq i \leq m$, $$(\Sigma_i):\quad a_{i,0}x^0 + \sum_{j=1}^{\nu} (a_{i,j}x^j + a'_{i,j} x^{j'}) = 0,$$ in order to cancel all the variables $x^{1'},\ldots,x^{\nu'}$. This is possible because there are $m \geq \nu + 1$ equations, strictly more than the number $\nu$ of variables we want to eliminate. Thus, there exists a nontrivial linear combination $$b_1(\Sigma_1) + \ldots + b_m(\Sigma_m)$$ of the equations $(\Sigma_i)_{1\leq i \leq m}$ in which none of the variables $x^{1'},\ldots,x^{\nu'}$ occurs. Observe that the nontriviality of this linear combination is guaranteed by the independence of the equations $(\Sigma_i)$. Thus, we have obtained a relation of the form $\eta_0$, as expected.
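The elimination step can be sketched concretely. Below, for $\nu = 2$ with coordinates ordered $(x^0, x^1, x^2, x^{1'}, x^{2'})$, the matrix $A$ is an arbitrary choice of $m = \nu + 1 = 3$ independent equations (my own example, not from the text); a vector $b$ in the left null space of the primed-column block gives the combination $b_1(\Sigma_1) + b_2(\Sigma_2) + b_3(\Sigma_3)$ that eliminates $x^{1'}, x^{2'}$:

```python
# A sketch of the elimination argument (sympy), for nu = 2 with
# coordinates ordered (x0, x1, x2, x1', x2'); primed columns are 3 and 4.
# With m = nu + 1 = 3 independent equations, the 3 rows of the primed
# block live in C^2, so some nonzero b satisfies b^T * primed = 0.
import sympy as sp

A = sp.Matrix([   # rows = coefficients of three independent equations
    [1, 2, 0, 1, 3],
    [0, 1, 1, 2, 1],
    [2, 0, 1, 1, 0],
])
assert A.rank() == 3                    # the equations are independent

primed = A[:, 3:]                       # 3 x 2 block of primed coefficients
b = primed.T.nullspace()[0]             # nonzero b with b^T * primed = 0
combo = b.T * A                         # combined equation b1*S1 + b2*S2 + b3*S3
assert combo[0, 3] == 0 and combo[0, 4] == 0   # x1', x2' are eliminated
assert any(c != 0 for c in combo[0, :3])       # eta0 is nontrivial
print(combo)
```

The nontriviality check at the end is exactly the remark above: since the rows of $A$ are independent, a nonzero $b$ cannot produce the zero equation.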