For a differentiable map $f: \mathbb{R^n}\to \mathbb{R^n}$, Show that $f^*({dy_1 \wedge\cdots \wedge dy_n})=\det(df)dx_1\wedge \cdots\wedge dx_n$


Let $f: \mathbb{R^n}\to \mathbb{R^n}$ be a differentiable map given by $f(x_1,\cdots, x_n) = (y_1,\cdots,y_n)$. Show that $f^*({dy_1 \wedge\cdots \wedge dy_n})=\det(df)dx_1\wedge \cdots\wedge dx_n$

This is one of the exercises in Do Carmo's book "Differential forms and applications".

I assume that by $f(x_1,\cdots, x_n) = (y_1,\cdots,y_n)$, he means that $y_i=f_i(x_1,\cdots,x_n)$.

We have $dy_i = \dfrac{\partial f_i}{\partial x_1}dx_1 + \cdots + \dfrac{\partial f_i}{\partial x_n}dx_n$

We also have $f^*(\omega \wedge \varphi)=f^*{\omega}\wedge f^*{\varphi}$ for any two differential forms. Therefore:

$f^*({dy_1 \wedge\cdots \wedge dy_n})=f^*{dy_1} \wedge \cdots \wedge f^*{dy_n}$

The problem is that I don't know what $f^*{dy_i}$ should be. I tried to calculate that by using the definition of pullback but it got very ugly and complicated. Therefore I think I'm either missing some important point about differential forms or it actually should get very ugly. Which one is it? :D

EDIT: My main problem is that I want to know what $f^*{dy_i}$ should be, because I feel there's still a big gap in my understanding of differential forms.



Since the two sides of your equation are $n$-forms, and the space of $n$-forms at each point of $\mathbb{R}^n$ is $1$-dimensional, it suffices to check that they take the same value on a single tuple of basis vectors. We take $\left(\frac{\partial}{\partial x_1}, \dots, \frac{\partial}{\partial x_n}\right)$, simply because the right-hand side is very easy to evaluate on it: the result is $\det(df)$. It therefore suffices to show that applying the left-hand side to this tuple also gives $\det(df) = \det\left(\frac{\partial f_j}{\partial x_i}\right)$, which follows directly from the formula in my comment above and the definition of $f^*$.
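As a quick numerical sanity check of this evaluation (with a made-up matrix standing in for the Jacobian of $df$ at a point), one can compute both sides on the coordinate basis; the sketch below uses numpy:

```python
import numpy as np

# The standard volume form dy_1 ∧ ... ∧ dy_n, evaluated on vectors
# v_1, ..., v_n, is the determinant of the matrix with columns v_i.
def volume_form(*vectors):
    return np.linalg.det(np.column_stack(vectors))

# Hypothetical Jacobian of some f at a point (made up for illustration).
J = np.array([[2.0, 1.0, 0.0],
              [0.5, 3.0, 1.0],
              [1.0, 0.0, 4.0]])

# By definition of the pullback, (f* omega)(e_1, e_2, e_3) equals
# omega(df e_1, df e_2, df e_3), and df e_i is the i-th column of J.
pullback_value = volume_form(J[:, 0], J[:, 1], J[:, 2])
assert np.isclose(pullback_value, np.linalg.det(J))
```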


If you really want to go ahead in that way, then notice that, in very simple terms, you should have by definition $$ f^* \mathrm d y_i = \mathrm d y_i ( \mathrm d f) = (0,\ldots,1,\ldots,0) \left(\frac{\partial f_h}{\partial x_k}\right)_{h,k=1,\ldots,n} = \left(\frac{\partial f_i}{\partial x_1},\ldots,\frac{\partial f_i}{\partial x_n}\right) . $$ Then it is known that $v_1 \wedge\ldots\wedge v_n = \mathrm{det}\left(\matrix{- v_1 -\\ \vdots \\ - v_n - }\right)$ and you are done.

Of course, one can appeal to the theory of fiber bundles to solve it right away.

EDIT: notice that you have $$ \mathrm{d}y_i \;:\; \mathbb{R}^n \longrightarrow \mathbb{R} \;:\; \textbf{v}=(v_1,\ldots,v_n) \longmapsto v_i $$ and $$ \mathrm{d}f \;:\; \mathbb{R}^n \longrightarrow \mathbb{R}^n \;:\; (a_1,\ldots,a_n) \longmapsto \left(\frac{\partial f_h}{\partial x_k}\right)_{h,k} (a_1,\ldots,a_n)^{T} $$ So $$ \mathrm{d}y_i (\mathrm{d}f) \;:\; \mathbb{R}^n \longrightarrow \mathbb{R} \;:\; \textbf{v} \longmapsto \left(\left(\frac{\partial f_h}{\partial x_k}\right)_{h,k} \textbf{v}^{T} \right) {}_i $$ that is, the $i^{\text{th}}$ entry of the product inside the parentheses.
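A small finite-difference check of this formula, using a hypothetical example map $f(x_1,x_2)=(x_1^2,\; x_1 x_2)$:

```python
import numpy as np

# Hypothetical example map f(x1, x2) = (x1**2, x1*x2) and its Jacobian.
def f(x):
    x1, x2 = x
    return np.array([x1**2, x1 * x2])

def jacobian(p):
    x1, x2 = p
    return np.array([[2 * x1, 0.0],
                     [x2,     x1 ]])

p = np.array([3.0, 5.0])
v = np.array([1.0, -2.0])
h = 1e-6

# df_p(v) approximated by a finite difference of f along v
dfv = (f(p + h * v) - f(p)) / h

# dy_i(df_p v) is the i-th component, i.e. (row i of the Jacobian) . v
for i in range(2):
    assert np.isclose(dfv[i], jacobian(p)[i] @ v, atol=1e-4)
```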


I'm writing an answer to my own question because it won't fit in comments and my original post would get too long if I edited it again.

First of all, thanks to people for their answers. I'm just writing this one to check my own understanding, because I'm still a bit confused about differential forms.

We have: $f^*{d\omega}=d(f^*\omega)$

On the other hand, we can think of $y_i: \mathbb{R}^n \to \mathbb{R}$ as the projection of the target $\mathbb{R}^n$ onto its $i$-th coordinate, so that $(y_i\circ f)(x_1,\cdots,x_n)=y_i(f(x_1,\cdots,x_n))=f_i(x_1,\cdots,x_n)$

Now we have: $f^*{dy_i}=d(f^*{y_i})=d(y_i\circ f)=df_i= \dfrac{\partial f_i}{\partial x_1}dx_1 + \cdots + \dfrac{\partial f_i}{\partial x_n}dx_n$, which is exactly the expression for $dy_i$ written above.

Therefore:

$$f^*{(dy_1 \wedge \cdots \wedge dy_n)}=f^*{dy_1} \wedge \cdots \wedge f^*{dy_n}$$ $$=\left(\dfrac{\partial f_1}{\partial x_1}dx_1 + \cdots + \dfrac{\partial f_1}{\partial x_n}dx_n\right) \wedge \cdots \wedge \left( \dfrac{\partial f_n}{\partial x_1}dx_1 + \cdots + \dfrac{\partial f_n}{\partial x_n}dx_n\right)$$

Multiplying this out, we obtain the equal expression:

$$\sum_{\sigma \in S_n} \dfrac{\partial f_1}{\partial x_{\sigma(1)}}\dfrac{\partial f_2}{\partial x_{\sigma(2)}}\cdots\dfrac{\partial f_n}{\partial x_{\sigma(n)}} dx_{\sigma(1)} \wedge dx_{\sigma(2)} \wedge \cdots \wedge dx_{\sigma(n)}$$

This is because the wedge product of $n$ one-forms is zero whenever two of them coincide. Therefore the only nonzero terms are those of the form $dx_{\sigma(1)} \wedge dx_{\sigma(2)} \wedge \cdots \wedge dx_{\sigma(n)}$ where $\sigma$ is a permutation of $\{1,\cdots,n\}$.

Because of the anti-commutativity of the wedge product, the above expression simplifies to:

$$\left(\sum_{\sigma \in S_n} \mathrm{sign(\sigma)} \dfrac{\partial f_1}{\partial x_{\sigma(1)}}\dfrac{\partial f_2}{\partial x_{\sigma(2)}}\cdots\dfrac{\partial f_n}{\partial x_{\sigma(n)}} \right) dx_1 \wedge dx_2 \wedge \cdots \wedge dx_n$$

But the expression in parentheses is precisely the Leibniz formula for $\det\left(\dfrac{\partial f_i}{\partial x_j}\right)$.

This completes the proof.
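As a sanity check of the last step, the signed sum over permutations (the Leibniz formula) can be compared with `numpy.linalg.det` on a random matrix standing in for the Jacobian:

```python
import itertools
import math
import numpy as np

def perm_sign(sigma):
    # Sign of a permutation via its inversion count.
    inversions = sum(1 for i in range(len(sigma))
                       for j in range(i + 1, len(sigma))
                       if sigma[i] > sigma[j])
    return -1 if inversions % 2 else 1

def leibniz_det(A):
    # Signed sum over all permutations, exactly as in the formula above.
    n = A.shape[0]
    return sum(perm_sign(s) * math.prod(A[i, s[i]] for i in range(n))
               for s in itertools.permutations(range(n)))

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # a random 4x4 "Jacobian"
assert np.isclose(leibniz_det(A), np.linalg.det(A))
```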


math.n00b's answer is very good and detailed. I just want to add a general observation: pulling back a differential form expressed in terms of the $dy_i$ is done simply by substituting $y_i=f_i(x_1,\cdots,x_n)$ under the differentials and using the chain rule, which produces a form expressed in terms of the $dx_i$.

For example, $f^*{dy_i}=\dfrac{\partial f_i}{\partial x_1}dx_1 + \cdots + \dfrac{\partial f_i}{\partial x_n}dx_n$.

The volume form then picks up a Jacobian factor: $f^*({dy_1 \wedge\cdots \wedge dy_n})=\det(Df_p)\,dx_1\wedge \cdots\wedge dx_n$, where $Df_p$ is the differential of $f$ at $p$, represented by the Jacobian matrix.
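For a concrete symbolic check in $\mathbb{R}^2$ (the map below is made up for illustration), sympy can expand $df_1 \wedge df_2$ coefficient-wise and compare with the Jacobian determinant:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
# Hypothetical example map f(x1, x2) = (x1**2 + sin(x2), x1*x2)
f1 = x1**2 + sp.sin(x2)
f2 = x1 * x2

# f* dy_i = df_i = (∂f_i/∂x1) dx1 + (∂f_i/∂x2) dx2; collect coefficients
a1, b1 = sp.diff(f1, x1), sp.diff(f1, x2)
a2, b2 = sp.diff(f2, x1), sp.diff(f2, x2)

# df1 ∧ df2 = (a1*b2 - b1*a2) dx1 ∧ dx2, by bilinearity, dx ∧ dx = 0,
# and dx2 ∧ dx1 = -dx1 ∧ dx2
wedge_coeff = sp.expand(a1 * b2 - b1 * a2)
jacobian_det = sp.Matrix([[a1, b1], [a2, b2]]).det()
assert sp.simplify(wedge_coeff - jacobian_det) == 0
```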