Every 2-vector is decomposable as $\alpha_1\wedge \alpha_2 + \cdots + \alpha_{2r-1}\wedge \alpha_{2r}$


I want to show that every $\omega \in \Lambda^2(V^*)$ on a finite-dimensional vector space $V$ can be written as $\omega = \alpha_1 \wedge \alpha_2 + \cdots + \alpha_{2r-1} \wedge \alpha_{2r}$, where the $\alpha_i$ are elements of a basis $B = \{\alpha_1, \dots, \alpha_n\}$ of $V^*$.

Moreover, $r$ is independent of $B$, and $\underbrace{\omega \wedge \cdots \wedge \omega}_{r \text{ times}} \neq 0$ but $\underbrace{\omega \wedge \cdots \wedge \omega}_{r+1 \text{ times}} = 0$.

I know that $\omega$ can be expressed as $\sum_{i,j}\lambda_{ij}\,\alpha_i \wedge \alpha_j$. I wonder why the coefficients $\lambda_{ij}$ "disappear".

Wikipedia says that $\lambda_{ij}=-\lambda_{ji}$, which makes sense, and that the rank of the matrix $(\lambda_{ij})$ is even, which is also understandable. It goes on to say that the matrix has twice the rank of $\omega$. Why is that so?

At the end it states that a 2-vector $\omega$ has rank $r$ if and only if:

$\underbrace{\omega \wedge \cdots \wedge \omega}_{r \text{ times}} \neq 0$ but $\underbrace{\omega \wedge \cdots \wedge \omega}_{r+1 \text{ times}} = 0$

I'm failing to prove this. Can someone help me out?

On BEST ANSWER

Assume that $\alpha_1,\dots,\alpha_n$ is a basis for $V^*$ and write $\omega = \sum\limits_{i<j} c_{ij}\cdot \alpha_i\wedge\alpha_j$. The coefficients $c_{ij}$ need not all be $1$, but we can use a trick:

For example, consider $\omega = 2\cdot v_1\wedge v_2+v_1\wedge v_3+v_2\wedge v_3$, where $v_1,v_2,v_3$ is a basis for $V$. You can first write $\omega = v_1\wedge (2\cdot v_2+v_3)+ v_2\wedge v_3$. And then, since $v_2\wedge v_2=0$, you have $\omega = v_1\wedge(2\cdot v_2+v_3)+v_2\wedge(2\cdot v_2+v_3)=(v_1+v_2)\wedge(2\cdot v_2+v_3)$. Note that $v_1+v_2$ and $2\cdot v_2+v_3$ are linearly independent. This trick can be carried out rigorously to write any $\omega$ in the form $\alpha_1\wedge\alpha_2+\cdots+\alpha_{2r-1}\wedge\alpha_{2r}$, where the $\alpha_i$ are linearly independent.
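As a quick sanity check of this example (a sketch using NumPy, with the standard encoding of a 2-form $\sum_{i<j} c_{ij}\, v_i\wedge v_j$ as the skew-symmetric matrix $M$ with $M_{ij}=c_{ij}$ for $i<j$; a decomposable piece $a\wedge b$ then has matrix $ab^T - ba^T$):

```python
import numpy as np

# omega = 2*v1^v2 + v1^v3 + v2^v3 as a skew-symmetric coefficient matrix
M = np.array([[ 0.,  2., 1.],
              [-2.,  0., 1.],
              [-1., -1., 0.]])

# The rank of M is twice the rank r of omega.
print(np.linalg.matrix_rank(M))   # 2, so r = 1: omega is decomposable

# Check the decomposition (v1+v2) ^ (2*v2+v3):
a = np.array([1., 1., 0.])        # v1 + v2
b = np.array([0., 2., 1.])        # 2*v2 + v3
print(np.allclose(np.outer(a, b) - np.outer(b, a), M))  # True
```

This also illustrates the claim from the question: the skew-symmetric coefficient matrix has rank $2r$, here $2\cdot 1 = 2$.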

First note that $r$ is the rank of $\omega$ since the vectors appearing above are all linearly independent.

Therefore $\omega\in\bigwedge^2 W$, where $W$ is the subspace spanned by $\alpha_1,\dots,\alpha_{2r}$. If you take the wedge of $\omega$ with itself $r+1$ times, the result is necessarily $0$, since it is a $(2r+2)$-form on the $2r$-dimensional vector space $W$. But if you take the wedge of $\omega$ with itself $r$ times, the result is $r!\cdot\alpha_1\wedge\alpha_2\wedge\cdots\wedge\alpha_{2r}$, which is nonzero.
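The $r$-fold wedge computation can be verified symbolically with a small hand-rolled exterior algebra (a sketch: forms are dicts mapping sorted index tuples to coefficients, and the sign comes from sorting the concatenated indices). Taking $\omega = \alpha_1\wedge\alpha_2 + \alpha_3\wedge\alpha_4$, so $r = 2$ in a $4$-dimensional space:

```python
def perm_sign(seq):
    """Sign of the permutation that sorts seq; 0 if an index repeats."""
    seq = list(seq)
    if len(set(seq)) < len(seq):
        return 0
    sign = 1
    for i in range(len(seq)):            # bubble sort, counting swaps
        for j in range(len(seq) - 1 - i):
            if seq[j] > seq[j + 1]:
                seq[j], seq[j + 1] = seq[j + 1], seq[j]
                sign = -sign
    return sign

def wedge(a, b):
    """Wedge product of two forms given as {sorted index tuple: coefficient}."""
    out = {}
    for ia, ca in a.items():
        for ib, cb in b.items():
            s = perm_sign(ia + ib)
            if s == 0:
                continue                  # repeated basis covector: term vanishes
            key = tuple(sorted(ia + ib))
            out[key] = out.get(key, 0) + s * ca * cb
    return {k: v for k, v in out.items() if v != 0}

# omega = a1^a2 + a3^a4, a 2-form of rank r = 2
omega = {(1, 2): 1, (3, 4): 1}
w2 = wedge(omega, omega)
print(w2)                # {(1, 2, 3, 4): 2}: r! * a1^a2^a3^a4 with r = 2
w3 = wedge(w2, omega)
print(w3)                # {}: the (r+1)-fold wedge vanishes
```

The coefficient $2 = r!$ matches the formula above, and the $(r+1)$-fold wedge is zero because every term would need to repeat a basis covector.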

Many of the arguments above are not fully rigorous; in any case, feel free to comment and I can edit to clarify.