pullback of wedge product of 1-forms


I would like some help understanding one of the equalities in the following proof:

Let $f:\mathbb{R}^n\rightarrow \mathbb{R}^m$ be a differentiable function. If $\alpha^1,\cdots ,\alpha^k$ are $1$-forms in $\mathbb{R}^m$, prove that $f^*(\alpha^1\wedge\cdots \wedge\alpha^k)=f^*(\alpha^1)\wedge\cdots \wedge f^*(\alpha^k)$

Proof:

\begin{align}f^*(\alpha^1\wedge\cdots \wedge\alpha^k)(v_1,\cdots ,v_k)&=(\alpha^1\wedge\cdots \wedge \alpha^k)(df(v_1),\cdots ,df(v_k))\\&=\det(\alpha^i(df(v_j)))\\&=\det(f^*\alpha^i(v_j))\\ &=\big(f^*(\alpha^1)\wedge\cdots \wedge f^*(\alpha^k)\big)(v_1,\cdots ,v_k) \end{align}

My question is: why is the second equality true? I know that the wedge product can be defined using determinants, but when we are dealing with differentials, what does $\alpha^i(df(v_j))$ actually mean, and why would we write it out that way?

The definition I am given for the wedge product is:

$$\alpha^1\wedge\cdots \wedge \alpha^k=\sum_{i_1<\cdots<i_k}\frac{\partial (\alpha^1,\cdots,\alpha^k)}{\partial(e^{i_1},\cdots,e^{i_k})}\,e^{i_1}\wedge\cdots\wedge e^{i_k}$$
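To make the notation concrete, here is the case $k=2$ in $\mathbb{R}^3$, writing $\alpha^1=a_1e^1+a_2e^2+a_3e^3$ and $\alpha^2=b_1e^1+b_2e^2+b_3e^3$. The definition unwinds to

$$\alpha^1\wedge\alpha^2=(a_1b_2-a_2b_1)\,e^1\wedge e^2+(a_1b_3-a_3b_1)\,e^1\wedge e^3+(a_2b_3-a_3b_2)\,e^2\wedge e^3,$$

where each coefficient is the $2\times 2$ determinant $\det\begin{pmatrix}a_{i_1}&a_{i_2}\\ b_{i_1}&b_{i_2}\end{pmatrix}$ built from the corresponding pair of components.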


There are 2 answers below.


So as always in differential geometry, you have to keep track of what type your objects are.

Here, notice that $df$ is a linear map $\mathbb{R}^n \rightarrow \mathbb{R}^m$, so it outputs vectors in $\mathbb{R}^m$.

On the other hand, $\alpha^i$ is a 1-form, so it takes a vector in $\mathbb{R}^m$ as its argument. Thus it should be clear (at least formally) what $\alpha^i(df(v_j))$ means: it is the number obtained by first pushing $v_j$ forward with $df$ and then evaluating $\alpha^i$ on the result.

You can show (by induction, for example, using that $(\alpha \wedge \beta)(V_1,V_2)=\alpha(V_1)\beta(V_2)-\alpha(V_2)\beta(V_1)$ for any vectors) that for any $V_1, \ldots, V_k$ in $\mathbb{R}^m$, $$\alpha^1 \wedge \cdots \wedge \alpha^k (V_1, \ldots ,V_k)=\det(\alpha^i(V_j)).$$

(To me this was pretty much the definition of wedge products).
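This determinant identity is also easy to sanity-check numerically. The sketch below (using NumPy, which is of course not part of the question) represents each 1-form as a covector, evaluates the wedge product by the alternating permutation sum, and compares the result with $\det(\alpha^i(V_j))$:

```python
import itertools
import numpy as np

def wedge_eval(alphas, vs):
    """Evaluate (a^1 ^ ... ^ a^k)(v_1, ..., v_k) by the alternating sum
    over permutations; alphas and vs are lists of k 1-D arrays."""
    k = len(alphas)
    total = 0.0
    for sigma in itertools.permutations(range(k)):
        # sign of the permutation, read off the permutation matrix
        sign = np.linalg.det(np.eye(k)[list(sigma)])
        total += sign * np.prod([alphas[i] @ vs[sigma[i]] for i in range(k)])
    return total

rng = np.random.default_rng(0)
k, m = 3, 5
alphas = [rng.standard_normal(m) for _ in range(k)]  # 1-forms on R^m as covectors
vs = [rng.standard_normal(m) for _ in range(k)]      # vectors in R^m

# the matrix M with entries M[i, j] = alpha^i(v_j)
M = np.array([[a @ v for v in vs] for a in alphas])
assert np.isclose(wedge_eval(alphas, vs), np.linalg.det(M))
```

The permutation sum is exactly the Leibniz expansion of $\det(M)$ with $M_{ij}=\alpha^i(V_j)$, which is why the two values agree.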

Hope this was helpful somehow.


It really just comes from this result: let $\alpha^1,\cdots,\alpha^k$ be linear functionals on a vector space $V$, and $v_1,\cdots,v_k\in V$. Then \begin{align*} (\alpha^1\wedge\cdots\wedge \alpha^k)(v_1,\cdots, v_k)&=A(\alpha^1\otimes\cdots\otimes\alpha^k)(v_1,\cdots, v_k)\\ &=\sum\limits_{\sigma\in S_k}(\text{sgn }\sigma)\,\alpha^1(v_{\sigma(1)})\cdots\alpha^k(v_{\sigma(k)})\\ &=\det [\alpha^i(v_j)]. \end{align*} Here, $A$ is the alternating operator $$Af=\sum\limits_{\sigma\in S_k}(\text{sgn }\sigma)\,\sigma f,$$ and, by definition, $$f\wedge g=\frac{1}{k!\,\ell!}A(f\otimes g)$$ for a $k$-form $f$ and an $\ell$-form $g$. By $\sigma f$ we mean $$(\sigma f)(v_1,\cdots, v_k)=f(v_{\sigma(1)},\cdots, v_{\sigma(k)}).$$

Also, since $\alpha^i$ is a one-form and $df(v_j)$ is a tangent vector, $\alpha^i(df(v_j))$ is perfectly well-defined.
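Putting the pieces together, the full identity $f^*(\alpha^1\wedge\cdots\wedge\alpha^k)=f^*(\alpha^1)\wedge\cdots\wedge f^*(\alpha^k)$ can be checked numerically for a linear $f$ (so that $df$ is just its matrix $A$). This is only a sanity-check sketch, assuming NumPy and representing 1-forms and their pullbacks as covectors:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 4, 6, 3
A = rng.standard_normal((m, n))                      # df for a linear f: R^n -> R^m
alphas = [rng.standard_normal(m) for _ in range(k)]  # 1-forms on R^m as covectors
vs = [rng.standard_normal(n) for _ in range(k)]      # tangent vectors in R^n

# left side: feed df(v_j) into the wedge on R^m, evaluated as det(alpha^i(df v_j))
lhs = np.linalg.det(np.array([[a @ (A @ v) for v in vs] for a in alphas]))

# right side: pull back each 1-form first (f* alpha = alpha o df, i.e. the
# covector A^T a), then take the wedge on R^n, again as a determinant
pullbacks = [A.T @ a for a in alphas]
rhs = np.linalg.det(np.array([[p @ v for v in vs] for p in pullbacks]))

assert np.isclose(lhs, rhs)
```

The check succeeds because $\alpha^i(A v_j)=(A^{\mathsf T}\alpha^i)(v_j)$ entrywise, so the two determinants are taken of the same matrix, which is precisely the third equality in the proof above.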