Hyperbolic space and its properties


This is Theorem 1.4.5 from Scharlau's book *Quadratic and Hermitian Forms*.

Let $(V, b)$ be a regular $2n$-dimensional bilinear space. The following conditions are equivalent:

(i) $(V,b)\cong \mathbb H(W)$ for an $n$-dimensional vector space $W$.

(ii) $V$ contains an $n$-dimensional totally isotropic subspace $W$.

(iii) $(V,b) \cong \langle B\rangle$ with $B$ a matrix of the form $$\begin{pmatrix} 0 & C \\ C^T & D \end{pmatrix},$$ where $0$, $C$, $D$ are $n \times n$ matrices.

(iv) $(V,b) \cong \langle B\rangle$ with $B$ the matrix $$\begin{pmatrix} 0 & E \\ E & 0 \end{pmatrix},$$ where $E$ denotes the $n \times n$ unit matrix.

(v) $(V,b) \cong\langle B\rangle$ with $B$ a matrix of the form $$\begin{pmatrix} A & 0 \\ 0 & -A \end{pmatrix},$$ where $A$ denotes an $n \times n$ invertible matrix.

(vi) $(V,b) \cong\langle 1,\dots,1,-1,\dots,-1\rangle \cong \langle 1,-1,1,-1,\dots,1,-1\rangle$
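To make the equivalence of (iv) and (vi) concrete in the smallest case $n=1$: over a field of characteristic $\ne 2$, the hyperbolic plane with matrix $\begin{pmatrix}0&1\\1&0\end{pmatrix}$ is congruent to $\operatorname{diag}(1,-1)$ via the new basis $u=e+\tfrac12 f$, $v=e-\tfrac12 f$. A quick numerical sketch (using numpy, purely as an illustration, not part of the book's proof):

```python
import numpy as np

# Gram matrix of the hyperbolic plane w.r.t. the basis e, f (condition (iv), n = 1)
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# New basis u = e + f/2, v = e - f/2, written as the columns of P
P = np.array([[1.0,  1.0],
              [0.5, -0.5]])

# A change of basis acts on Gram matrices by congruence: P^T B P
G = P.T @ B @ P
print(G)  # diag(1, -1), i.e. the form <1, -1> of condition (vi)
```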

A previous discussion regarding this can be found in the question Hyperbolic space and isomorphism.

I don't understand how (ii) implies (iii), and how (iv) implies (i). The proof is given in the book, but can someone elaborate on these two implications?


The implication (ii)$\Rightarrow$(iii). We just do what is suggested in the proof given in the book.

Complete an arbitrary basis $e_1,\dots,e_n$ of $W$ to a basis of $V$. The matrix of $b$ with respect to this basis has the required form.

If we have a basis $e_1,\dots,e_n,f_1,\dots,f_n$ for $V$ such that $e_1,\dots,e_n$ is a basis of $W$, then the upper left quarter of the matrix has to be zero, simply because $W$ is totally isotropic, which implies that $b(e_i,e_j)=0$ for any $i,j\in\{1,2,\dots,n\}$. Then symmetry implies that the lower left block is the transpose of the upper right block. So we get that the matrix is $$B= \begin{pmatrix} 0 & C \\ C^T & D \\ \end{pmatrix}. $$ Since $(V,b)$ is regular, the matrix $B$ is regular (Corollary 3.2). Expanding $\det(B)$ along the first $n$ rows gives $\det(B)=\pm\det(C)\det(C^T)=\pm\det(C)^2\ne0$ and, consequently, $\det(C)\ne0$.
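To see this concretely, here is a small numerical sketch (numpy; the space and the isotropic subspace are example data of my own choosing, not from the book): take $V=\mathbb R^4$ with $b$ given by $\operatorname{diag}(1,1,-1,-1)$. The plane $W$ spanned by $e_1+e_3$ and $e_2+e_4$ is totally isotropic, and completing it to a basis of $V$ yields a Gram matrix of exactly the block form above, with $\det(C)\ne0$.

```python
import numpy as np

# Bilinear form b(x, y) = x^T A y on V = R^4 (example data of my own choosing)
A = np.diag([1.0, 1.0, -1.0, -1.0])

# Columns of T: a basis of V whose first two vectors e1 + e3, e2 + e4
# span a totally isotropic subspace W; the last two complete the basis
T = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0]])

B = T.T @ A @ T  # Gram matrix of b w.r.t. this basis

C = B[:2, 2:]                         # upper right n x n block
assert np.allclose(B[:2, :2], 0.0)    # upper left block vanishes: W isotropic
assert np.allclose(B[2:, :2], C.T)    # symmetry gives the transposed block
assert abs(np.linalg.det(C)) > 1e-12  # regularity forces det(C) != 0
print(B)
```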

From matrix representation to isomorphism with $\mathbb H(W)$.

We know from Theorem 2.2 that two bilinear spaces are isomorphic if and only if the corresponding matrices are congruent. So to prove that (iv) implies (i) it suffices to show that $\mathbb H(W)$ has the matrix $$\begin{pmatrix} 0 & E \\ E & 0 \\ \end{pmatrix}$$ w.r.t. some basis. To show this it suffices to choose an arbitrary basis $e_1,\dots,e_n$ of $W$ and the dual basis $e_1^*,\dots,e_n^*$ of $W^*$. Here $e_i^*\in W^*$ is determined by $e_i^*(e_j)=\delta_{ij}$, i.e., $$e_i^*(e_j)= \begin{cases} 1, & i=j, \\ 0, & i \ne j. \end{cases} $$

Then $(e_1,0)\dots,(e_n,0),(0,e_1^*),\dots,(0,e_n^*)$ is a basis of $W\oplus W^*$ and the matrix w.r.t. this basis has precisely the required form. Just notice that $b((e_i,0),(e_j,0))=b((0,e_i^*),(0,e_j^*))=0$ for any $i$, $j$. (See also Lemma 4.2.) And also we have $$b((e_i,0),(0,e_j^*))=e_j^*(e_i)=\delta_{ij}= \begin{cases} 1, & i=j, \\ 0, & i \ne j. \end{cases} $$
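The computation of this Gram matrix can also be sketched numerically (numpy; representing a functional $\varphi\in W^*$ by the vector of its values on the $e_i$, so that evaluation $\varphi(w)$ becomes a dot product, is my own bookkeeping, not the book's notation):

```python
import numpy as np

n = 3

def h(x, y):
    """Hyperbolic form on W + W*: h((w, phi), (w', phi')) = phi(w') + phi'(w).
    Elements are pairs (w, phi); phi acts on w by the dot product."""
    (w1, p1), (w2, p2) = x, y
    return p1 @ w2 + p2 @ w1

E = np.eye(n)
# Basis (e_i, 0), ..., (0, e_i^*); the dual basis is again the standard one
basis = [(E[i], np.zeros(n)) for i in range(n)] + \
        [(np.zeros(n), E[i]) for i in range(n)]

G = np.array([[h(x, y) for y in basis] for x in basis])
expected = np.block([[np.zeros((n, n)), E], [E, np.zeros((n, n))]])
assert np.allclose(G, expected)  # Gram matrix is exactly (0 E; E 0)
print(G)
```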

Explicit description of the isomorphism. It seems that the OP is interested specifically in the proof given in the book. (Although in my opinion the details of this proof are very similar to checking that two bilinear spaces with the same matrix representation are isomorphic. And in this specific case finding a suitable basis such that we get the same matrix is rather straightforward, as I tried to show above.) Anyway, here are some comments on the proof of this implication as described in the book:

Let $e_1,\dots,e_n,e'_1,\dots,e'_n$ be the basis of $V$ with respect to which the matrix of $b$ has the form $\begin{pmatrix}0&E\\E&0\end{pmatrix}$. Let $W=e_1K+\dots+e_nK$ and let $e_1^*,\dots,e_n^*$ be the dual basis of $W^*$. Then $\alpha \colon V\to W\oplus W^*$ defined by $\alpha(e_i)=e_i$, $\alpha(e_i')=e_i^*$ is an isometry from $(V,b)$ to $\mathbb H(W)$. To prove that $\alpha$ is an isometry, first note that $\alpha$ behaves as an isometry on the basis vectors and the result then follows for arbitrary linear combinations of the basis vectors.

Notice that the author of the book chose a different notation for elements of $W\oplus W^*$ than I did above. (In my notation I would have written $\alpha(e_i)=(e_i,0)$ and $\alpha(e_i')=(0,e_i^*)$.)

We want to show that $$h(\alpha x,\alpha y)=b(x,y)$$ for any $x,y\in V$.

This equation is true if $x$ and $y$ belong to the basis. Indeed, we have \begin{align*} b(e_i,e_j)&=h(e_i,e_j)=0\\ b(e'_i,e'_j)&=h(e_i^*,e_j^*)=0\\ b(e_i,e'_j)&=h(e_i,e_j^*)=e_j^*(e_i)=\delta_{ij} \end{align*}

Then it remains to show that the same is true for linear combinations of these vectors. This is a consequence of bilinearity.
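In coordinates this last step reads as follows (my own summary, writing $x=\sum_i x_iv_i$ and $y=\sum_j y_jv_j$ in the basis $v_1,\dots,v_{2n}$ consisting of the $e_i$ and $e_i'$): $$h(\alpha x,\alpha y)=\sum_{i,j}x_iy_j\,h(\alpha v_i,\alpha v_j)=\sum_{i,j}x_iy_j\,b(v_i,v_j)=b(x,y),$$ where the first and last equalities use bilinearity of $h$ and $b$, and the middle one uses the basis computations above.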