Using the 'wedge product'


Given $ω=f\,dx+g\,dy+h\,dz$ such that $ω\wedge dz=0$, what can we conclude about $f, g$ and $h$?

I am going through some textbook exercises and ran across this notation, and I am not sure exactly what the $\wedge$ represents. It was always my understanding that $dx\wedge dy$ was just another way of writing $dx\cdot dy$ (the product of the two differentials). I was hoping someone could elaborate on what the $\wedge$ notation means, and then maybe I could interpret the question a bit better.

Thanks


There are 2 answers below.

Answer (0 votes):

Let $E$ be a real vector space of dimension $n$, and let $f$ and $g$ be alternating forms of degree $p$ and $q$ respectively; then: $$f\wedge g(x_1,\ldots,x_{p+q}):=\frac{1}{p!q!}\sum_{\sigma\in\mathfrak{S}_{p+q}}\varepsilon(\sigma)f(x_{\sigma(1)},\ldots,x_{\sigma(p)})g(x_{\sigma(p+1)},\ldots,x_{\sigma(p+q)}),$$ where $\mathfrak{S}_{p+q}$ is the permutation group of $\{1,\ldots,p+q\}$ and $\varepsilon(\sigma)$ is the signature of $\sigma$.
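As a sanity check, the formula above can be implemented directly on $\mathbb{R}^3$. This is a minimal numeric sketch; the helper names `sgn` and `wedge` are hand-rolled for illustration, not a library API.

```python
# Direct implementation of the alternating-sum formula above.
# Forms are plain Python functions taking vectors (tuples).
from itertools import permutations
from math import factorial

def sgn(p):
    """Signature of a permutation given as a tuple of indices."""
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def wedge(f, p, g, q):
    """Wedge of a p-form f and a q-form g: the 1/(p! q!) sum over S_{p+q}."""
    return lambda *xs: sum(
        sgn(s) * f(*(xs[i] for i in s[:p])) * g(*(xs[i] for i in s[p:]))
        for s in permutations(range(p + q))
    ) / (factorial(p) * factorial(q))

# Dual basis covectors on R^3.
dx = lambda v: v[0]
dy = lambda v: v[1]
dz = lambda v: v[2]
e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)

dxdy = wedge(dx, 1, dy, 1)
print(dxdy(e1, e2))  # 1.0
print(dxdy(e2, e1))  # -1.0  (alternating)
print(dxdy(e1, e1))  # 0.0
```

For two 1-forms the sum has only two terms, $dx(x_1)dy(x_2)-dx(x_2)dy(x_1)$, which is exactly the skew-symmetry the output shows.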

Here are some fundamental properties of the wedge product:

  • $(f\wedge g)\wedge h=f\wedge(g\wedge h)$,
  • $f\wedge g=(-1)^{\deg(f)\deg(g)}g\wedge f$,
  • $(f+\lambda g)\wedge h=f\wedge h+\lambda g\wedge h$,
  • If $f_1,\ldots,f_n$ are linear forms, then $f_1\wedge\ldots\wedge f_n(x_1,\ldots,x_n)$ is $\det((f_i(x_j))_{1\leqslant i,j\leqslant n})$.
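These properties can be checked on small examples. Below is a numeric sketch of the graded commutativity and the determinant property on $\mathbb{R}^3$ (the `sgn`/`wedge` helpers are hand-rolled, not a standard API).

```python
from itertools import permutations
from math import factorial

def sgn(p):
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def wedge(f, p, g, q):
    return lambda *xs: sum(
        sgn(s) * f(*(xs[i] for i in s[:p])) * g(*(xs[i] for i in s[p:]))
        for s in permutations(range(p + q))
    ) / (factorial(p) * factorial(q))

dx, dy, dz = (lambda v: v[0]), (lambda v: v[1]), (lambda v: v[2])
u, v, w = (1, 2, 3), (4, 5, 6), (7, 8, 10)   # arbitrary sample vectors

# Graded commutativity for two 1-forms: dx ^ dy = (-1)^{1*1} dy ^ dx.
assert wedge(dx, 1, dy, 1)(u, v) == -wedge(dy, 1, dx, 1)(u, v)

# dx ^ dy ^ dz evaluated on (u, v, w) equals det((f_i(x_j))).
vol = wedge(wedge(dx, 1, dy, 1), 2, dz, 1)
det = (dx(u) * (dy(v) * dz(w) - dy(w) * dz(v))
       - dx(v) * (dy(u) * dz(w) - dy(w) * dz(u))
       + dx(w) * (dy(u) * dz(v) - dy(v) * dz(u)))
print(vol(u, v, w), det)  # both equal -3
```

Associativity lets us build the 3-form as `(dx ^ dy) ^ dz` without worrying about parenthesization.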

Let me also record a useful result that generalizes your question:

Proposition. Let $E$ be a real vector space of dimension $n$, $\omega\in\Lambda^kE^*$ and $\alpha\in E^*\setminus\{0\}$, then: $$\alpha\wedge\omega=0\iff\omega_{\vert\ker(\alpha)}=0\iff\exists\beta\in\Lambda^{k-1}E^*\textrm{ s.t. }\omega=\alpha\wedge\beta.$$

I recommend you think about a proof of this result; think of the incomplete basis theorem (i.e., extending a basis of $\ker(\alpha)$ to a basis of $E$).
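A quick numeric illustration of the proposition, with $E=\mathbb{R}^3$, $\alpha = dz$, and $k=2$ (same hand-rolled `wedge` helper as before; a sketch, not a proof): $\omega = dx\wedge dz$ is of the form $\alpha\wedge\beta$ up to sign, vanishes on $\ker(dz)=\operatorname{span}(e_1,e_2)$, and satisfies $dz\wedge\omega=0$; while $\omega' = dx\wedge dy$ fails all three conditions.

```python
from itertools import permutations
from math import factorial

def sgn(p):
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def wedge(f, p, g, q):
    return lambda *xs: sum(
        sgn(s) * f(*(xs[i] for i in s[:p])) * g(*(xs[i] for i in s[p:]))
        for s in permutations(range(p + q))
    ) / (factorial(p) * factorial(q))

dx, dy, dz = (lambda v: v[0]), (lambda v: v[1]), (lambda v: v[2])
e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)

# omega = dx ^ dz: restricted to ker(dz) it is 0, and dz ^ omega = 0.
omega = wedge(dx, 1, dz, 1)
print(omega(e1, e2))                        # 0.0
print(wedge(dz, 1, omega, 2)(e1, e2, e3))   # 0.0

# omega2 = dx ^ dy: nonzero on ker(dz), and dz ^ omega2 != 0.
omega2 = wedge(dx, 1, dy, 1)
print(omega2(e1, e2))                       # 1.0
print(wedge(dz, 1, omega2, 2)(e1, e2, e3))  # 1.0
```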

Answer (8 votes):

You can see $\wedge$ as a skew-symmetric product. So for your exercise, since $dz\wedge dz=0$, the condition $\omega \wedge dz = 0$ translates as $$f\,dx\wedge dz + g \,dy\wedge dz = 0,$$and since the "symbols" $\{dx \wedge dy, dx \wedge dz, dy\wedge dz\}$ are linearly independent, we get $f=g=0$, while $h$ can be anything.
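Pointwise (i.e., with $f,g,h$ frozen at a point; the sample values below are hypothetical), this can be checked directly: evaluating $\omega\wedge dz$ on the basis pairs $(e_1,e_3)$ and $(e_2,e_3)$ recovers $f$ and $g$, while $h$ drops out because $dz\wedge dz = 0$. The `wedge` helper is hand-rolled.

```python
from itertools import permutations
from math import factorial

def sgn(p):
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def wedge(f, p, g, q):
    return lambda *xs: sum(
        sgn(s) * f(*(xs[i] for i in s[:p])) * g(*(xs[i] for i in s[p:]))
        for s in permutations(range(p + q))
    ) / (factorial(p) * factorial(q))

dx, dy, dz = (lambda v: v[0]), (lambda v: v[1]), (lambda v: v[2])
e1, e2, e3 = (1, 0, 0), (0, 1, 0), (0, 0, 1)

f, g, h = 2.0, -3.0, 7.0          # hypothetical values of f, g, h at one point
omega = lambda v: f * dx(v) + g * dy(v) + h * dz(v)
wz = wedge(omega, 1, dz, 1)       # omega ^ dz

print(wz(e1, e3))  # 2.0   -> recovers f
print(wz(e2, e3))  # -3.0  -> recovers g
print(wz(e1, e2))  # 0.0   -> h never appears, since dz ^ dz = 0
```

So $\omega\wedge dz = 0$ forces $f = g = 0$, and places no constraint on $h$.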

Now that we have that out of the way, let's understand what is going on. If $V$ is a vector space and $T,S$ are $k$-linear and $\ell$-linear, we define $${\rm Alt}(T)(v_1,\ldots, v_k) = \frac{1}{k!} \sum_{\sigma \in S_k}{\rm sgn}(\sigma)\,T(v_{\sigma(1)},\ldots, v_{\sigma(k)}).$$Then $$T \wedge S \doteq \frac{(k+\ell)!}{k!\ell!}{\rm Alt}(T\otimes S),$$where $T\otimes S$ denotes the tensor product of $T$ and $S$.

One can prove that if $(e_1,\ldots,e_n)$ is a basis for $V$, then $$\{ e^*_{i_1} \wedge \cdots \wedge e^*_{i_k} \mid 1 \leq i_1 < \cdots < i_k \leq n \}$$is a basis for the space of $k$-linear alternating maps on $V$, where $(e_1^*,\ldots, e^*_{n})$ denotes the dual basis for $V^*$.

One applies this linear algebra pointwise, on each tangent space of a manifold. So, for example, $dx \wedge dy$ is something that takes two vector fields $X$ and $Y$ and spits out $$(dx \wedge dy)(X,Y) = \begin{vmatrix}dx(X) & dx(Y) \\ dy(X) & dy(Y) \end{vmatrix}.$$
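The determinant description at the end can be checked on concrete values of the vector fields at a point (a minimal sketch; `wedge` is the same hand-rolled helper as above, and $X, Y$ are made-up sample vectors):

```python
from itertools import permutations
from math import factorial

def sgn(p):
    return (-1) ** sum(p[i] > p[j]
                       for i in range(len(p)) for j in range(i + 1, len(p)))

def wedge(f, p, g, q):
    return lambda *xs: sum(
        sgn(s) * f(*(xs[i] for i in s[:p])) * g(*(xs[i] for i in s[p:]))
        for s in permutations(range(p + q))
    ) / (factorial(p) * factorial(q))

dx, dy = (lambda v: v[0]), (lambda v: v[1])
X, Y = (2, 3, 0), (5, 7, 0)           # sample vector-field values at a point

lhs = wedge(dx, 1, dy, 1)(X, Y)
rhs = dx(X) * dy(Y) - dx(Y) * dy(X)   # the 2x2 determinant expanded
print(lhs, rhs)  # both equal -1
```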