Intersection of kernels of linearly independent smooth 1-forms on $\mathbb R^n$


I'm trying to solve the following problem:

Let $\omega^1,\dots,\omega^k$ be smooth $1$-forms on $\mathbb R^n$ that are linearly independent at each point of $\mathbb R^n$. For $p\in\mathbb R^n$, define $$ D_p = \bigcap_{i=1}^k \ker \omega^i_p \subset T_p\mathbb R^n \text. $$ Show that each $p\in\mathbb R^n$ has a neighborhood $U_p$ on which there exist smooth vector fields $X_1,\dots,X_{n-k}$ that span $D_q$ for each $q\in U_p$.

What I've done so far is first to notice that for each $i$, I can write $$ \omega^i = \sum_{j=1}^n a_{ij} \, dx^j $$ for a unique choice of real-valued functions $a_{ij}$ on $\mathbb R^n$, which must be smooth since $\omega^i$ is smooth. Now for a point $p\in\mathbb R^n$, I know that since $\omega^1,\dots,\omega^k$ are linearly independent at $p$, the matrix of coefficients $[a_{ij}(p)]$ has maximal rank $k$.
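For concreteness, this coefficient-matrix point of view can be sketched numerically. The two forms below are invented placeholders (any family whose coefficient matrix $[a_{ij}(q)]$ has rank $k$ at every point would do):

```python
import numpy as np

# Coefficient matrix [a_ij(q)] of k = 2 made-up forms on R^3:
#   omega^1 = dx^1 + x^3 dx^2,   omega^2 = dx^2 + x^1 dx^3
def coeffs(q):
    x = np.asarray(q, dtype=float)
    return np.array([
        [1.0, x[2], 0.0],
        [0.0, 1.0, x[0]],
    ])

# Linear independence of omega^1_q, omega^2_q is exactly the statement
# that [a_ij(q)] has maximal rank k; here the leading 2 x 2 identity
# block guarantees this at every q.
for q in np.random.default_rng(0).normal(size=(5, 3)):
    assert np.linalg.matrix_rank(coeffs(q)) == 2
```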

If I define the map $f\colon \mathbb R^n\to\mathbb R^k$ by $$ f^i(b^1,\dots,b^n) = \omega^i_p\biggl( \sum_{j=1}^n b^j \frac{\partial}{\partial x^j} \biggr\rvert_p \biggr) = \sum_{j=1}^n a_{ij}(p) \, b^j \text, $$ then for each $(b^1,\dots,b^n)\in\mathbb R^n$, $f(b^1,\dots,b^n) = 0$ if and only if $$ \sum_{j=1}^n b^j \frac{\partial}{\partial x^j} \biggr\rvert_p \in D_p \text. $$

Furthermore, the Jacobian of $f$ at any point of $\mathbb R^n$ is just $[a_{ij}(p)]$, which has rank $k$, so given $(b^1,\dots,b^n)\in f^{-1}(0)$, the implicit function theorem tells us that (after reordering the coordinates so that the last $k$ columns of $[a_{ij}(p)]$ form an invertible block)

there exist neighborhoods $A$ of $(b^1,\dots,b^{n-k})\in\mathbb R^{n-k}$ and $B$ of $(b^{n-k+1},\dots,b^n)\in \mathbb R^k$ and there exists a smooth function $h\colon A\to B$ such that for all $(y^1,\dots,y^n)\in A\times B$, $$ f(y^1,\dots,y^n) = 0 \quad \iff \quad (y^{n-k+1},\dots,y^n) = h(y^1, \dots, y^{n-k}) \text. $$
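Since $f$ is linear in $y$, the function $h$ produced by the implicit function theorem can in fact be written down explicitly: splitting $[a_{ij}(p)] = [B \mid C]$ with $C$ the invertible $k\times k$ block (after reordering), one gets $h(y') = -C^{-1}B\,y'$. A quick numerical check, with a made-up rank-$k$ matrix standing in for $[a_{ij}(p)]$:

```python
import numpy as np

k, n = 2, 4
Ap = np.array([[1.0, 2.0, 1.0, 0.0],
               [0.0, 1.0, 0.0, 1.0]])   # made-up [a_ij(p)], rank 2
B, C = Ap[:, :n - k], Ap[:, n - k:]     # last k columns are invertible here

def h(y_first):
    # Explicit implicit-function-theorem solution of f(y) = 0
    # for the last k coordinates in terms of the first n - k
    return -np.linalg.solve(C, B @ y_first)

y_first = np.array([1.0, -1.0])
y = np.concatenate([y_first, h(y_first)])
print(np.allclose(Ap @ y, 0))           # True: the assembled y lies in ker f = D_p
```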

However, I don't know if that helps me to find the necessary neighborhood $U_p$ of $p$ and vector fields $X_1, \dots, X_{n-k}$ on $U_p$. Any advice?


Note: this result is also stated in a more general form on page 219 of Methods of Nonlinear Analysis: Applications to Differential Equations, although I didn't notice any comments there on how it is proved.

Best Answer

I think this becomes easier by putting less emphasis on coordinates. View your forms as defining, for each $x\in\mathbb R^n$, a linear map $\omega_x\colon\mathbb R^n\to\mathbb R^k$. By the assumption of linear independence, each map $\omega_x$ is surjective. Now take a point $p$, choose a basis $\tilde X_1,\dots,\tilde X_{n-k}$ for $\ker(\omega_p)$, and complete it by vectors $Y_1,\dots,Y_k$ to a basis of $\mathbb R^n$. View these vectors as constant vector fields on $\mathbb R^n$. By construction, for each $i=1,\dots,k$, $q\mapsto \omega_q(Y_i)$ is a smooth function $\mathbb R^n\to\mathbb R^k$. Basic linear algebra tells you that the $k$ vectors $\omega_p(Y_1),\dots,\omega_p(Y_k)$ are linearly independent in $\mathbb R^k$ (since $\omega_p$ is surjective and the $Y_i$ span a complement of its kernel) and thus form a basis. By continuity, there is an open neighborhood $V$ of $p$ in $\mathbb R^n$ such that $\{\omega_q(Y_1),\dots,\omega_q(Y_k)\}$ is a basis of $\mathbb R^k$ for each $q\in V$: the determinant of the matrix with these columns depends continuously on $q$ and is nonzero at $p$.

Now for $i=1,\dots,n-k$, $q\mapsto \omega_q(\tilde X_i)$ is also a smooth function $\mathbb R^n\to\mathbb R^k$. On $V$, its value can be expressed in the basis from above, so there are smooth functions $a_{ij}\colon V\to\mathbb R$, for $i=1,\dots,n-k$ and $j=1,\dots,k$, such that $\omega_q(\tilde X_i)=\sum_j a_{ij}(q)\,\omega_q(Y_j)$ (smoothness of the $a_{ij}$ follows from Cramer's rule, say). Then you just define $X_i(q):=\tilde X_i-\sum_j a_{ij}(q)Y_j$. By construction, the vector fields $X_1,\dots,X_{n-k}$ lie in the kernel of $\omega_q$ for each $q\in V$. Since they coincide with the $\tilde X_i$ at the point $p$ (where $a_{ij}(p)=0$) and thus are linearly independent there, there is a neighborhood $U\subset V$ of $p$ such that $X_1(q),\dots,X_{n-k}(q)$ are linearly independent for each $q\in U$. Since $\ker(\omega_q)$ has dimension $n-k$, they have to be a basis of the kernel of $\omega_q$ for each such $q$.
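This construction is concrete enough to run numerically. Below is a sketch in NumPy, with a made-up pair of forms on $\mathbb R^4$ (so $k=2$, $n=4$); the kernel basis $\tilde X_i$ and the completion $Y_j$ are obtained from an SVD at $p$, and the coefficients $a_{ij}(q)$ come from solving the linear system in the basis $\{\omega_q(Y_j)\}$:

```python
import numpy as np

# A(q): the k x n matrix whose rows are the coefficients of the forms
# omega^1, ..., omega^k at q, so omega_q(v) = A(q) @ v.  The entries
# are an invented example; the identity block in the first two columns
# makes the forms independent at every q.
def A(q):
    x = np.asarray(q, dtype=float)
    return np.array([
        [1.0, 0.0, x[0],         np.sin(x[1])],
        [0.0, 1.0, np.exp(x[2]), x[3]],
    ])

k, n = 2, 4
p = np.zeros(n)

# Basis Xt_1, ..., Xt_{n-k} of ker(omega_p), completed by Y_1, ..., Y_k
# to a basis of R^n: the last n-k right-singular vectors of A(p) span
# the kernel, and the first k span a complement.
_, _, Vt = np.linalg.svd(A(p))
Y  = Vt[:k].T         # n x k, columns Y_1, ..., Y_k
Xt = Vt[k:].T         # n x (n-k), columns Xt_1, ..., Xt_{n-k}

def X(q):
    """Columns are X_1(q), ..., X_{n-k}(q), a basis of ker(omega_q) near p."""
    Aq = A(q)
    # Solve omega_q(Xt_i) = sum_j a_ij(q) omega_q(Y_j) for the a_ij(q);
    # Aq @ Y is invertible on a neighborhood V of p.
    coeff = np.linalg.solve(Aq @ Y, Aq @ Xt)   # entry (j, i) is a_ij(q)
    return Xt - Y @ coeff                      # X_i = Xt_i - sum_j a_ij Y_j

q = p + 0.1 * np.ones(n)                  # a nearby point
fields = X(q)
print(np.allclose(A(q) @ fields, 0))      # True: each X_i(q) lies in ker(omega_q)
print(np.linalg.matrix_rank(fields))      # n - k = 2: still linearly independent
```

The SVD here is just a convenient way to produce the basis choices made in the proof; any splitting of $\mathbb R^n$ into $\ker(\omega_p)$ and a complement would work the same way.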

It is possible to rephrase this in terms of the matrices you use, but I think in this form it is easier to see what is going on.