Exterior Product of 2-Forms in $\mathbb{R}^n$


So I am studying the book of Arnold, "Mathematical Methods of Classical Mechanics" and I am trying to understand the differential forms.

The exterior product of two $1$-forms $ω_1$ and $ω_2$ is defined as the $2$-form whose value on a pair of vectors $ξ_1,ξ_2$ is \begin{equation} (ω_1 \wedge ω_2)(ξ_1,ξ_2) = \begin{vmatrix} ω_1(ξ_1) & ω_2(ξ_1) \\ ω_1(ξ_2) & ω_2(ξ_2) \end{vmatrix}. \end{equation} By now I have studied linear $1$- and $2$-forms. In particular, I would like to prove that every $2$-form $ω^2$ on the $n$-dimensional space with coordinates $x_1,\ldots,x_n$ can be uniquely represented in the form \begin{equation} ω^2=\sum_{i<j}α_{ij}\,x_i \wedge x_j. \end{equation} A hint is given: Let $e_i$ be the $i$-th basis vector, i.e. $x_i(e_i)=1$ and $x_j(e_i)=0$ for $i \neq j$. Then $ω^2(e_i,e_j)=α_{ij}$.
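As a quick sanity check on this definition, here is a minimal numerical sketch (the helper names `coord` and `wedge` are my own, not Arnold's): a coordinate $1$-form reads off one component of a vector, and the wedge of two $1$-forms is the $2\times 2$ determinant above.

```python
import numpy as np

def coord(i):
    """The coordinate 1-form x_i on R^n: x_i(xi) = xi[i] (0-based index)."""
    return lambda xi: xi[i]

def wedge(w1, w2):
    """Exterior product of two 1-forms: the 2-form
    (w1 ^ w2)(xi1, xi2) = w1(xi1) w2(xi2) - w2(xi1) w1(xi2),
    i.e. the 2x2 determinant from the definition."""
    return lambda xi1, xi2: w1(xi1) * w2(xi2) - w2(xi1) * w1(xi2)

e1, e2, e3 = np.eye(3)            # standard basis of R^3
x1, x2 = coord(0), coord(1)       # the 1-forms x_1, x_2
w = wedge(x1, x2)

print(w(e1, e2))                  # 1.0
print(w(e2, e1))                  # -1.0  (skew-symmetry)
print(w(e1, e3))                  # 0.0
```

Evaluating on $(e_1,e_2)$ gives $1$, and swapping the arguments flips the sign, exactly as the determinant predicts.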

How can I prove that?


BEST ANSWER

You are effectively asked to show that the $2$-forms $x_i\wedge x_j$ with $1\leq i<j\leq n$ form a basis for the vector space of all $2$-forms. It is instructive to verify that the set of all $2$-forms is indeed a (real) vector space. I'll first prove the fact without the hint, and later with the hint.


Suppose first that $\omega^2$ is a product of two $1$-forms, i.e. $\omega^2=\omega_1\wedge\omega_2$ for a pair of $1$-forms $\omega_1,\omega_2\in(\Bbb{R}^n)^{\ast}$; a general $2$-form is a linear combination of such products, so the argument below extends by linearity. As you note, the coordinate functions $x_1,\ldots,x_n\in(\Bbb{R}^n)^{\ast}$ form a basis for $(\Bbb{R}^n)^{\ast}$, which means that $$\omega_1=\sum_{i=1}^n\beta_i x_i\qquad\text{ and }\qquad\omega_2=\sum_{j=1}^n\gamma_j x_j,$$ for unique $\beta_1,\ldots,\beta_n,\gamma_1,\ldots,\gamma_n\in\Bbb{R}$. It follows that for all $\xi_1,\xi_2\in\Bbb{R}^n$ \begin{eqnarray*} \omega^2(\xi_1,\xi_2) &=&\begin{vmatrix} \sum_{i=1}^n\beta_i x_i(\xi_1) & \sum_{j=1}^n\gamma_j x_j(\xi_1) \\ \sum_{i=1}^n\beta_i x_i(\xi_2) & \sum_{j=1}^n\gamma_j x_j(\xi_2) \end{vmatrix}\\ &=&\sum_{i=1}^n\beta_i x_i(\xi_1)\cdot\sum_{j=1}^n\gamma_j x_j(\xi_2) -\sum_{i=1}^n\beta_i x_i(\xi_2)\cdot\sum_{j=1}^n\gamma_j x_j(\xi_1)\\ &=&\sum_{i=1}^n\sum_{j=1}^n\beta_i\gamma_jx_i(\xi_1)x_j(\xi_2)-\sum_{i=1}^n\sum_{j=1}^n\beta_i\gamma_jx_i(\xi_2)x_j(\xi_1). \end{eqnarray*} This seems like quite the mess, but we can group the terms by the factors $x_i(\xi_1)x_j(\xi_2)$ to get $$\sum_{i=1}^n\sum_{j=1}^n(\beta_i\gamma_j-\beta_j\gamma_i)x_i(\xi_1)x_j(\xi_2).$$ By antisymmetry the coefficients of $x_i(\xi_1)x_j(\xi_2)$ and $x_j(\xi_1)x_i(\xi_2)$ differ only in sign, so this reduces to \begin{eqnarray*} \sum_{i<j}(\beta_i\gamma_j-\beta_j\gamma_i)(x_i(\xi_1)x_j(\xi_2)-x_j(\xi_1)x_i(\xi_2)) &=&\sum_{i<j}(\beta_i\gamma_j-\beta_j\gamma_i) \begin{vmatrix} x_i(\xi_1) & x_j(\xi_1)\\ x_i(\xi_2) & x_j(\xi_2) \end{vmatrix}\\ &=&\sum_{i<j}(\beta_i\gamma_j-\beta_j\gamma_i)\,x_i\wedge x_j. \end{eqnarray*} So we see that $\alpha_{ij}=\beta_i\gamma_j-\beta_j\gamma_i$. And indeed $$\omega^2(e_i,e_j)= \begin{vmatrix} \sum_{k=1}^n\beta_k x_k(e_i) & \sum_{k=1}^n\gamma_k x_k(e_i) \\ \sum_{k=1}^n\beta_k x_k(e_j) & \sum_{k=1}^n\gamma_k x_k(e_j) \end{vmatrix}= \begin{vmatrix} \beta_i & \gamma_i \\ \beta_j & \gamma_j \end{vmatrix} =\beta_i\gamma_j-\beta_j\gamma_i=\alpha_{ij}.$$
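The coefficient formula $\alpha_{ij}=\beta_i\gamma_j-\beta_j\gamma_i$ can be checked numerically; a small sketch with random coefficients (variable names are mine, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
beta = rng.standard_normal(n)     # omega_1 = sum_i beta_i x_i
gamma = rng.standard_normal(n)    # omega_2 = sum_j gamma_j x_j

def omega2(xi1, xi2):
    """(omega_1 ^ omega_2)(xi1, xi2) via the 2x2 determinant definition."""
    return (beta @ xi1) * (gamma @ xi2) - (beta @ xi2) * (gamma @ xi1)

E = np.eye(n)                     # rows are the basis vectors e_1, ..., e_n
for i in range(n):
    for j in range(i + 1, n):
        # the claimed coefficient alpha_ij = beta_i gamma_j - beta_j gamma_i
        alpha_ij = beta[i] * gamma[j] - beta[j] * gamma[i]
        assert np.isclose(omega2(E[i], E[j]), alpha_ij)
print("alpha_ij = beta_i*gamma_j - beta_j*gamma_i on all basis pairs")
```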


With the hint there is a slicker proof as follows:

Let $\omega^2$ be a $2$-form, i.e. a skew-symmetric $\Bbb{R}$-bilinear map $$\omega^2:\ \Bbb{R}^n\times\Bbb{R}^n\ \longrightarrow\ \Bbb{R}.$$ For any pair $(i,j)$ with $1\leq i<j\leq n$ the $2$-form $x_i\wedge x_j$ is $\Bbb{R}$-bilinear (verify this!), and for all $1\leq a<b\leq n$ it satisfies \begin{eqnarray*} (x_i\wedge x_j)(e_a,e_b) &=& \begin{vmatrix} x_i(e_a) &x_j(e_a)\\ x_i(e_b) &x_j(e_b) \end{vmatrix}\\ &=&x_i(e_a)x_j(e_b)-x_i(e_b)x_j(e_a)\\ &=&\left\{\begin{array}{ll} 1&\text{ if }a=i\text{ and }b=j\\ 0&\text{ otherwise. } \end{array} \right. \end{eqnarray*} Now define $\alpha_{ij}:=\omega^2(e_i,e_j)$. Then the linear combination of bilinear maps $$\sum_{i<j}\alpha_{ij}\,x_i\wedge x_j,$$ is again bilinear and skew-symmetric (verify this!), and for all pairs $(a,b)$ with $1\leq a<b\leq n$ it satisfies $$\left(\sum_{i<j}\alpha_{ij}x_i\wedge x_j\right)(e_a,e_b)=\sum_{i<j}\alpha_{ij}\,(x_i\wedge x_j)(e_a,e_b)=\alpha_{ab}=\omega^2(e_a,e_b).$$ Of course $e_1,\ldots,e_n$ is a basis for $\Bbb{R}^n$, so the two bilinear maps agree on all pairs of basis vectors with $a<b$; by skew-symmetry they then agree on all pairs of basis vectors (for $a>b$ both sides flip sign, and for $a=b$ both vanish). Two bilinear maps that agree on all pairs of basis vectors agree everywhere, i.e. they are the same. So $$\omega^2=\sum_{i<j}\alpha_{ij}\,x_i\wedge x_j.$$
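This basis-pair argument is easy to test numerically for a general (not necessarily decomposable) $2$-form, represented by a skew-symmetric matrix; a sketch, with names of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
M = rng.standard_normal((n, n))
A = M - M.T                       # skew-symmetric: omega^2(xi1, xi2) = xi1^T A xi2

def omega2(xi1, xi2):
    """A general 2-form: a skew-symmetric bilinear map on R^n."""
    return xi1 @ A @ xi2

E = np.eye(n)                     # rows are the basis vectors e_1, ..., e_n

def reconstruction(xi1, xi2):
    """sum_{i<j} alpha_ij * (x_i ^ x_j)(xi1, xi2), with alpha_ij = omega^2(e_i, e_j)."""
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            alpha_ij = omega2(E[i], E[j])
            total += alpha_ij * (xi1[i] * xi2[j] - xi1[j] * xi2[i])
    return total

xi1, xi2 = rng.standard_normal(n), rng.standard_normal(n)
assert np.isclose(omega2(xi1, xi2), reconstruction(xi1, xi2))
print("omega^2 agrees with sum_{i<j} alpha_ij x_i ^ x_j")
```

Since both maps are bilinear and skew-symmetric, agreement on the basis pairs forces agreement on arbitrary vectors, which is what the random test confirms.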


So is this correct? First, let us assume that we are working in $\mathbb{R}^n$. Taking the standard basis $B=\{e_1,\ldots,e_n\}$, I then evaluate the exterior product of the basis $1$-forms $x_i, x_j$, elements of the dual of $\mathbb{R}^n$, on this basis. I get:

\begin{equation} (x_i \wedge x_j)(e_i,e_j)= \begin{vmatrix} x_i(e_i) & x_j(e_i) \\ x_i(e_j) & x_j(e_j) \end{vmatrix}=\begin{vmatrix} 1 & 0 \\ 0 & 1 \end{vmatrix}=1 \end{equation} Now, $x_i \wedge x_j$ is itself a $2$-form, that is, \begin{equation} ω^2=x_i \wedge x_j \end{equation} is a valid $2$-form, and therefore it holds that \begin{equation} ω^2(e_i,e_j)=(x_i \wedge x_j)(e_i,e_j)=1. \end{equation} Since the set of all $1$-forms $x_i$ is a dual basis for $\left( \mathbb{R}^n \right)^*$, we can deduce that for all $ω\in (\mathbb{R}^n)^*$ it holds that \begin{equation} ω=\sum_{i=1}^nα_ix_i. \end{equation} If it holds for every $ω$, then it should hold for $ω^2$, which is bilinear and skew-symmetric. Therefore: \begin{equation} ω^2=\sum_{i<j}α_{ij}x_i\wedge x_j \end{equation} I think I am missing something.