Existence of Orthonormal Basis of a Metric in a Manifold


Definition: A metric $g$ on a manifold $M$ is a tensor field of type $(0,2)$ such that

(1) it is symmetric, i.e. $g(v,w)=g(w,v)$ for any $v,w \in V_p$, $p\in M$, and

(2) it is non-degenerate, i.e. if $v_1 \in V_p$ is such that $g(v_1,v)=0$ for all $v \in V_p$, then $v_1=0$.

($V_p$ denotes the tangent space at $p$)

Theorem: Given a metric $g$, for each $p \in M$ there exists an orthonormal basis $v_1,\dots,v_n$ of $V_p$, i.e. $g(v_\mu,v_\nu)=\pm\delta_{\mu\nu}$. Moreover, if $S,S'$ are two orthonormal bases, then

$$\#\{s \in S: g(s,s)=1\}=\#\{s' \in S': g(s',s')=1\}$$

and

$$\#\{s \in S: g(s,s)=-1\}=\#\{s' \in S': g(s',s')=-1\}$$

For the first part, I am wondering whether there is anything like the Gram-Schmidt process that can be applied in this case; but since the metric may not be positive definite, I can't see how to do so.

For the second part, I do not have any idea how to approach it.

Thank you for any suggestions.


Best Answer

Suppose we have a semi-Riemannian manifold $(M^n,g)$ with metric signature $(n-k,k)$. By definition, for each $p \in M$ the map $g_p : T_pM \times T_pM \to \Bbb{R}$ is a non-degenerate, symmetric, bilinear form. We can find an orthonormal basis for $T_pM$ at any $p \in M$ by induction, as shown in Lemma 24 of Barrett O'Neill's Semi-Riemannian Geometry, p. 50.
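Concretely, once we represent $g_p$ by its Gram matrix $G$ in some basis, an orthonormal basis can be produced numerically. The sketch below takes a route via the spectral theorem rather than the inductive argument of the lemma (the function name is mine, not from any source):

```python
import numpy as np

def orthonormal_basis(G, tol=1e-10):
    """For a nondegenerate symmetric matrix G (the Gram matrix of g in
    some basis), return B whose columns b_i satisfy b_i^T G b_j = ±δ_ij.

    Diagonalize G = Q diag(w) Q^T with Q orthogonal, then rescale each
    eigenvector by 1/sqrt(|eigenvalue|); the signs of the eigenvalues
    survive as the ±1 entries."""
    w, Q = np.linalg.eigh(G)            # eigenvalues ascending, Q orthogonal
    assert np.all(np.abs(w) > tol)      # nondegeneracy: no zero eigenvalues
    B = Q / np.sqrt(np.abs(w))          # scale column i by 1/sqrt(|w_i|)
    return B                            # B.T @ G @ B = diag(sign(w_i))
```

For example, the hyperbolic form $G=\begin{bmatrix}0&1\\1&0\end{bmatrix}$ is diagonalized to $\mathrm{diag}(-1,1)$, exhibiting signature $(1,1)$.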

For the second question, it is enough to work at the level of vector spaces. Suppose $V$ is an $n$-dimensional real vector space endowed with a non-degenerate, symmetric bilinear form $g : V \times V \to \Bbb{R}$, and let $\{e_1,\dots,e_n\}$ be an arbitrary orthonormal basis for $V$.

Let $k \leq n$ be the number of negative values among $g(e_1,e_1),\dots,g(e_n,e_n) \in \{\pm1\}$. The case $k=0$ is trivial. If $k>0$, then $V$ has a subspace on which $g$ is negative definite, e.g. the span of one such basis element. Let $W$ be a subspace of maximal dimension on which $g$ is negative definite. If we can show that $k=\dim W$, then we are done, since this number is independent of the basis.

By rearranging the basis $\{e_1,\dots,e_k,e_{k+1},\dots,e_n\}$, we may assume $g(e_i,e_i)=-1$ for $1\leq i \leq k$ and $g(e_j,e_j)=1$ for $k+1\leq j \leq n$. Since $g$ is negative definite on $X = \text{span}(e_1,\dots,e_k)$, we get $k=\dim X \leq \dim W$. To show $k \geq \dim W$, define a map $T : W \to X$ as follows: for any $w = \sum_{i=1}^n w^i e_i \in W$, $$ T(w) = \sum_{i=1}^k w^ie_i. $$ This $T$ is injective: if $T(w)=0$, then $w=\sum_{j=k+1}^n w^je_j$, so $g(w,w)=\sum_{j=k+1}^n (w^j)^2 \geq 0$; but $g$ is negative definite on $W$, forcing $w=0$. Therefore $\dim W = \dim\ker T + \dim\operatorname{Im}T = \dim\operatorname{Im}T \leq \dim X=k$. Hence the number $k$ is fixed by $g$, independent of the basis.
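As a concrete illustration of this basis-independence (an example, not part of the proof): for the Minkowski form on $\Bbb{R}^4$, the standard basis is orthonormal with one $-1$, and applying a Lorentz boost produces a different orthonormal basis with the same sign count, since the boost preserves the Gram matrix:

```python
import numpy as np

# Minkowski form on R^4: one -1 and three +1 on the diagonal
eta = np.diag([-1.0, 1.0, 1.0, 1.0])

# A Lorentz boost in the (0,1)-plane; its columns are another
# eta-orthonormal basis, because L.T @ eta @ L == eta.
phi = 0.7                              # arbitrary rapidity
L = np.eye(4)
L[0, 0] = L[1, 1] = np.cosh(phi)
L[0, 1] = L[1, 0] = np.sinh(phi)

# Gram matrix of the boosted basis: identical, so the same count of -1's
assert np.allclose(L.T @ eta @ L, eta)
```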

Another Answer

For any symmetric bilinear form (not necessarily nondegenerate) $\langle\_,\_\rangle$ on a finite-dimensional vector space $V$ over $\mathbb{R}$, I claim that there exists a basis of $V$, called a good basis, consisting of $$u_1,u_2,\ldots,u_p,v_1,v_2,\ldots,v_q,w_1,w_2,\ldots,w_r\in V$$ such that

  • $p+q+r=\dim_\mathbb{R}(V)$
  • $\langle u_i,u_j\rangle =+\delta_{i,j}$ for $i,j=1,2,\ldots,p$,
  • $\langle v_i,v_j\rangle = -\delta_{i,j}$ for $i,j=1,2,\ldots,q$,
  • $\langle u_i,v_j\rangle=0$ for $i=1,2,\ldots,p$ and $j=1,2,\ldots,q$, and
  • $\langle x,w_k\rangle=0$ for all $x\in V$ and $k=1,2,\ldots,r$.

Here, $\delta$ is the Kronecker delta. Thus, in the basis above, the bilinear form is represented by the matrix $$J_{p,q,r}:=\begin{bmatrix} +I_{p\times p}&0_{p\times q}&0_{p\times r}\\ 0_{q\times p}&-I_{q\times q}&0_{q\times r}\\ 0_{r\times p}&0_{r\times q}&0_{r\times r}\end{bmatrix}\,,$$ where $I_{k\times k}$ is the $k$-by-$k$ identity matrix and $0_{\alpha\times \beta}$ is the $\alpha$-by-$\beta$ zero matrix. For each $x\in V$, we write $\|x\|$ for $\sqrt{\big|\langle x,x\rangle\big|}$, and write $\sigma(x)\in\{-1,0,+1\}$ for the sign of $\langle x,x\rangle$.
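For reference, the matrix $J_{p,q,r}$ is easy to materialize; a one-line NumPy helper (the naming is mine):

```python
import numpy as np

def J(p, q, r):
    """The diagonal matrix J_{p,q,r}: p entries +1, then q entries -1,
    then r zeros, of size (p+q+r) x (p+q+r)."""
    return np.diag([1.0] * p + [-1.0] * q + [0.0] * r)
```

For instance, `J(1, 2, 1)` is the 4-by-4 diagonal matrix with entries $1,-1,-1,0$.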

First, let $W$ be the kernel of the bilinear form: $W$ consists of all vectors $z\in V$ for which $\langle x,z\rangle=0$ for all $x\in V$. Let $r$ denote the dimension of $W$ over $\mathbb{R}$. We can take $\left\{w_1,w_2,\ldots,w_r\right\}$ to be any basis of $W$. (This also shows that $r$ is independent of the choice of good basis of $V$, as it must be the $\mathbb{R}$-dimension of $W$, a fixed subspace of $V$.) If $W\neq 0$, we pass to the quotient space $V/W$ with the bilinear form $\langle\!\langle\_,\_\rangle\!\rangle$ defined by $$\langle\!\langle x+W,y+W\rangle\!\rangle:=\langle x,y\rangle\text{ for all }x,y\in V\,,$$ which is well-defined precisely because $W$ is the kernel. Hence, from now on, we may assume that $W=0$, i.e. that the bilinear form $\langle\_,\_\rangle$ is nondegenerate.
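This first step can be carried out numerically: $W$ is the null space of the Gram matrix of the form in any basis, so $r$ is the number of (numerically) zero singular values. A minimal sketch with NumPy (the function name is mine):

```python
import numpy as np

def radical_dimension(G, tol=1e-10):
    """Dimension r of the kernel W of the symmetric bilinear form with
    Gram matrix G: the number of zero singular values of G."""
    singular_values = np.linalg.svd(G, compute_uv=False)
    return int(np.sum(singular_values < tol))
```

For example, `radical_dimension(np.diag([1.0, -1.0, 0.0, 0.0]))` returns `2`.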

Fix a basis $\left\{z_1,z_2,\ldots,z_n\right\}$ of $V$ and write $[n]:=\{1,2,\ldots,n\}$. We shall perform the following orthonormalization procedure. First, we look at $\|z_1\|$. If $\|z_1\|=0$, then $\langle z_1,z_j\rangle \neq 0$ for some index $j\in[n]\setminus\{1\}$ (as the bilinear form is nondegenerate), so we can replace $z_1$ by $z_1+tz_j$, where $t\in\mathbb{R}\setminus\{0\}$ is chosen so that $2\,\langle z_1,z_j\rangle +t\langle z_j,z_j\rangle \neq 0$; then $\langle z_1+tz_j,z_1+tz_j\rangle=t\,\big(2\,\langle z_1,z_j\rangle+t\,\langle z_j,z_j\rangle\big)\neq 0$. Hence, we may always assume that $\|z_1\|\neq 0$. Dividing $z_1$ by $\|z_1\|$, we may also assume that $\|z_1\|=1$, so that $\langle z_1,z_1\rangle=\sigma(z_1)=\pm1$. Now, for $j=2,3,\ldots,n$, we replace $z_j$ by $$z_j-\sigma(z_1)\,\langle z_j,z_1\rangle\, z_1\,.$$ By doing so, we may assume that each $z_j$ is already orthogonal to $z_1$.

Let $V_1$ denote the orthogonal complement of $z_1$: it consists of all vectors $x\in V$ such that $\langle x,z_1\rangle=0$. Clearly, $V_1$ is an $(n-1)$-dimensional $\mathbb{R}$-subspace of $V$. By the paragraph above, $V_1$ is the $\mathbb{R}$-span of $z_2,z_3,\ldots,z_n$, and $V_1$ inherits the bilinear form $\langle\_,\_\rangle_1$ obtained by restricting $\langle \_,\_\rangle$ to $V_1\times V_1$. We can then repeat the paragraph above for $V_1$, noting that $\langle\_,\_\rangle_1$ is again nondegenerate. Hence, by induction, we may assume that the vectors $z_1,z_2,\ldots,z_n$ are mutually orthogonal and that $\|z_j\|=1$ for every $j\in[n]$.
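Assuming the form is nondegenerate (so $r=0$), the procedure of the last two paragraphs can be sketched directly on the Gram matrix $G$ of the form in the starting basis. This is an illustrative implementation of the steps above (function name mine, not numerically robust):

```python
import numpy as np

def good_basis(G, tol=1e-10):
    """For a nondegenerate symmetric matrix G, return Z whose columns
    follow the procedure above: Z.T @ G @ Z is diagonal with entries ±1."""
    n = G.shape[0]
    Z = [np.eye(n)[:, i] for i in range(n)]     # start from any basis
    for i in range(n):
        z = Z[i]
        if abs(z @ G @ z) < tol:
            # z is null: pick z_j with <z, z_j> != 0 (nondegeneracy)
            # and replace z by z + t z_j with 2<z,z_j> + t<z_j,z_j> != 0
            j = next(k for k in range(i + 1, n) if abs(z @ G @ Z[k]) > tol)
            zj, t = Z[j], 1.0
            if abs(2 * (z @ G @ zj) + t * (zj @ G @ zj)) < tol:
                t = 2.0                          # any other nonzero t works
            z = z + t * zj
        eps = 1.0 if z @ G @ z > 0 else -1.0     # sigma(z)
        z = z / np.sqrt(abs(z @ G @ z))          # normalize: ||z|| = 1
        Z[i] = z
        # make the later vectors orthogonal to z (note the sign eps)
        for k in range(i + 1, n):
            Z[k] = Z[k] - eps * (Z[k] @ G @ z) * z
    return np.column_stack(Z)
```

Running this on the hyperbolic plane $G=\begin{bmatrix}0&1\\1&0\end{bmatrix}$ exercises the null-vector fix-up: $z_1=e_1$ has $\|z_1\|=0$, gets replaced by $e_1+e_2$, and the result has one $+1$ and one $-1$.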

Then, we let $u_1,u_2,\ldots,u_p$ be the vectors $z_j$ with $\sigma(z_j)=+1$, and $v_1,v_2,\ldots,v_q$ be the vectors $z_j$ with $\sigma(z_j)=-1$ ($j\in[n]$). Note that $p$ and $q$ are also independent of the choice of good basis. Perhaps the easiest way to see this is to show that, if $S$ is an $n$-by-$n$ real symmetric matrix and $A$ is an $n$-by-$n$ invertible real matrix, then $A^\top\,S\,A$ and $S$ have the same number of positive eigenvalues and the same number of negative eigenvalues, counted with multiplicity. The triple $(p,q,r)$ is called the signature of the symmetric bilinear form.
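This matrix form of the invariance (Sylvester's law of inertia) can be spot-checked numerically by comparing eigenvalue sign counts before and after a congruence; the helper name below is mine:

```python
import numpy as np

def inertia(S, tol=1e-10):
    """(number of positive, number of negative) eigenvalues of symmetric S."""
    w = np.linalg.eigvalsh(S)
    return int(np.sum(w > tol)), int(np.sum(w < -tol))

rng = np.random.default_rng(0)
S = rng.standard_normal((5, 5))
S = S + S.T                          # a random symmetric matrix
A = rng.standard_normal((5, 5))      # invertible with probability 1
# Congruence A^T S A preserves the sign counts, though not the eigenvalues
assert inertia(S) == inertia(A.T @ S @ A)
```

Note that the eigenvalues themselves change under congruence (only similarity preserves them); it is exactly the pair of sign counts that is invariant.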