Since $\operatorname{codim} \partial M = 1$, there exist two unit vectors $n_x \in T_x M$ orthogonal to $T_x(\partial M)$


So I am studying boundary orientation (e.g. Guillemin and Pollack page 97, Mukherjee page 182), and it is stated in the first introduction that if

$\dim M \geq 1$ and $M$ is a smooth manifold with boundary, then since $\operatorname{codim} \partial M = 1$, at each $x \in \partial M$ there exist exactly two unit vectors in the tangent space $T_x(M)$ orthogonal to $T_x(\partial M)$ (one inward and one outward). I am 100% sure this is just some trivial linear algebra.

Because if the leftover space has codimension $1$, then it is a one-dimensional linear subspace. But how do we know the leftover would be orthogonal?
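To pin down the linear-algebra fact I have in mind (my paraphrase, not taken from either book): fix any inner product on $T_x(M)$, for instance the one induced by an embedding $M \subset \mathbb{R}^N$. Then

```latex
% Orthogonal decomposition of the tangent space at a boundary point.
% dim T_x(M) = n and dim T_x(\partial M) = n - 1, since codim \partial M = 1.
\[
T_x(M) \;=\; T_x(\partial M) \,\oplus\, T_x(\partial M)^{\perp},
\qquad
\dim T_x(\partial M)^{\perp}
  \;=\; \dim T_x(M) - \dim T_x(\partial M)
  \;=\; 1,
\]
% so the orthogonal complement is a line, and a line contains
% exactly two unit vectors, \pm n_x.
```

so the complement is automatically orthogonal once an inner product is chosen, and it contains exactly two unit vectors $\pm n_x$.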

On BEST ANSWER

Let $X^n$ be a smooth manifold with boundary and let $x\in \partial X$. Let $\phi:U\to X$ be a local parametrization with $\phi(0)=x$, where $U\subset H^n$ is a neighborhood of $0$ in upper half-space (vectors in $\mathbb{R}^n$ with nonnegative $n$-th component). Then, under G&P's definition of tangent space, $T_x(X)$ is the image of $d\phi_0$. Further, once one knows that $d\phi_0(H^n)\subset T_x(X)$ does not depend on the choice of $\phi$, we can define $H_x(X)=d\phi_0(H^n)$.

We want to show that there are exactly two unit vectors perpendicular to $T_x(\partial X)$ and that one points into $H_x(X)$ and the other away from it. First, observe that $T_x(X)=T_x(\partial X)\oplus T_x(\partial X)^{\perp}$. Since $T_x(X)$ has dimension $n$ and $T_x(\partial X)$ has dimension $n-1$, the complement $T_x(\partial X)^{\perp}$ has dimension $1$ and therefore contains exactly two unit vectors, say $\hat{v}$ and $-\hat{v}$, each of which spans it. Now $d\phi_0^{-1}$ takes $T_x(\partial X)$ to $\mathbb{R}^{n-1}\times\lbrace 0\rbrace$, so $d\phi_0^{-1}$ takes $T_x(\partial X)^\perp$ to a one-dimensional subspace $W$ of $\mathbb{R}^n$ with $W\oplus (\mathbb{R}^{n-1}\times\lbrace 0\rbrace)=\mathbb{R}^n$. Hence $d\phi_0^{-1}(\hat{v})$ spans $W$ and has nonzero $n$-th component. Exactly one of $d\phi_0^{-1}(\hat{v})$ and $d\phi_0^{-1}(-\hat{v})$ has positive $n$-th component and so lies in $H^n$; relabeling if necessary, we may take it to be $d\phi_0^{-1}(\hat{v})$. Then $\hat{v}\in H_x(X)$ and $-\hat{v}$ points away from $H_x(X)$, as desired.
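As a concrete sanity check (my own example, not from G&P), take $X = H^2 = \{(x_1,x_2) : x_2 \ge 0\}$ with $\phi$ the identity:

```latex
% At a boundary point x = (a, 0) of X = H^2:
%   T_x(X) = \mathbb{R}^2,  T_x(\partial X) = \mathbb{R} \times \{0\},
%   H_x(X) = H^2.
\[
T_x(\partial X)^{\perp} \;=\; \{0\}\times\mathbb{R},
\qquad
\hat{v} = (0,1), \quad -\hat{v} = (0,-1).
\]
% (0,1) has positive second component, so it lies in H^2: the inward normal.
% (0,-1) has negative second component: the outward normal.
```

Here $d\phi_0$ is the identity, so the abstract argument reduces to reading off signs of the second component.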