$E$ is a real vector space of dimension $n$, and $E^*$ is the dual space of $E$. Assume $\alpha \in \Lambda^{n-1}(E)$. Show that there exist $\alpha_1, \alpha_2, \ldots, \alpha_{n-1} \in E^*$ such that $$\alpha = \alpha_1 \wedge \alpha_2 \wedge \cdots \wedge \alpha_{n-1}.$$
$(n-1)$-alternating tensors on $E$ are decomposable
637 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 solutions below.
Let me make my life easier by just proving this for $E = \mathbb R^n$; I'm going to use the standard basis $e_1, \ldots, e_n$ for that, and the dual basis $\phi_1, \ldots, \phi_n$ defined by $\phi_i(v) = e_i \cdot v$, using the usual inner product.
I'm also going to show that there are functionals $\alpha_i$ such that $$ \alpha = r \alpha_1 \wedge \ldots \wedge \alpha_{n-1} $$ i.e., I'm going to end up with a real factor $r$ which needs to be combined with one of the $\alpha_i$ to get the result in the form you asked for.
The dimension of $\Lambda^{n-1} (E)$ is $n$. There's a nice map $F$ from $\Lambda^1(E)$ to $\Lambda^n(E)$ defined by $$ \phi \mapsto \alpha \wedge \phi. $$ The codomain (alternating $n$-forms on $\mathbb R^n$) is one-dimensional, generated by the determinant function. So for any $\phi$, there's a constant $c(\phi)$ with $F(\phi) = c(\phi) \cdot \det$.
The function $c: \Lambda^1(E) \to \mathbb R$ is evidently linear. If it's zero, then $\alpha \wedge \phi = 0$ for every $1$-form $\phi$, which forces $\alpha$ to be zero too, and I'm gonna leave that case to you. :)
So assuming $c$ is nonzero, its kernel is an $(n-1)$-dimensional subspace of $\Lambda^1(E)$; identifying $1$-forms with vectors via the inner product, the kernel corresponds to a subspace of $E$, which has an orthonormal basis $v_1, \ldots, v_{n-1}$. Let $\alpha_i$ be the dual of $v_i$, i.e., the linear functional such that $$ \alpha_i(u) = v_i \cdot u. $$
Then $(\alpha_1 \wedge \ldots \wedge \alpha_{n-1})(v_1, \ldots, v_{n-1}) = \det(\alpha_i(v_j)) = 1$, since the $v_i$ are orthonormal. And in fact, $\alpha = r \alpha_1 \wedge \ldots \wedge \alpha_{n-1}$ where $$ r = \alpha(v_1, \ldots, v_{n-1}). $$
Let me do an example: on $\mathbb R^3$, look at $\alpha = dx \wedge dy + dy \wedge dz$. For this, we have $$ c(dx) = 1, \quad c(dy) = 0, \quad c(dz) = 1. $$ So my $v$-basis will be $s(-1, 0, 1),\ (0, 1, 0)$ (where $s = \sqrt{2}/2$). That means (up to constants) that $$ \alpha_1 = -dx + dz, \qquad \alpha_2 = dy, $$ so I'm claiming that $$ dx \wedge dy + dy \wedge dz = r\,(-dx + dz) \wedge dy, $$ which is correct (for $r = -1$).
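This identity is easy to sanity-check numerically. In the sketch below (my own, not part of the answer), a 1-form on $\mathbb R^3$ is stored as its coefficient vector and a 2-form $\omega$ as the antisymmetric matrix $B_{ij} = \omega(e_i, e_j)$, so the wedge of two 1-forms is an outer-product difference:

```python
import numpy as np

def wedge(a, b):
    # (a ∧ b)(e_i, e_j) = a_i * b_j - a_j * b_i, stored as an antisymmetric matrix
    return np.outer(a, b) - np.outer(b, a)

# coefficient vectors of dx, dy, dz in the dual of the standard basis
dx, dy, dz = np.eye(3)

alpha = wedge(dx, dy) + wedge(dy, dz)   # dx ∧ dy + dy ∧ dz
rhs = -wedge(-dx + dz, dy)              # r * (-dx + dz) ∧ dy with r = -1
assert np.array_equal(alpha, rhs)
```

The matrix representation also makes the antisymmetry $\omega(u, v) = -\omega(v, u)$ visible as $B = -B^T$.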
Now that I've done the construction, I leave the proof of correctness to you. It really amounts to checking that the product of the $\alpha_i$s and the $(n-1)$-form $\alpha$ take the same values on the basis of $\mathbb R^n$ provided by the $v_i$s together with a final unit vector $w$ orthogonal to all of them, i.e., their "cross product".
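The whole construction can also be run end to end for the $\mathbb R^3$ example. This is an illustrative sketch of my own (the representation of 2-forms as antisymmetric matrices and the use of an SVD to get an orthonormal kernel basis are my assumptions, not part of the answer):

```python
import numpy as np

def wedge(a, b):
    # (a ∧ b)(e_i, e_j) = a_i * b_j - a_j * b_i, as an antisymmetric matrix
    return np.outer(a, b) - np.outer(b, a)

def c(alpha, phi):
    # coefficient of alpha ∧ phi relative to det, i.e.
    # (alpha ∧ phi)(e_1, e_2, e_3) by the standard shuffle formula
    return alpha[0, 1] * phi[2] - alpha[0, 2] * phi[1] + alpha[1, 2] * phi[0]

dx, dy, dz = np.eye(3)
alpha = wedge(dx, dy) + wedge(dy, dz)

# the linear functional c as a coefficient vector; here it is (1, 0, 1)
c_vec = np.array([c(alpha, phi) for phi in np.eye(3)])

# rows 1 and 2 of Vt form an orthonormal basis of ker(c)
_, _, Vt = np.linalg.svd(c_vec[None, :])
v1, v2 = Vt[1], Vt[2]

r = v1 @ alpha @ v2                           # r = alpha(v1, v2)
assert np.allclose(alpha, r * wedge(v1, v2))  # alpha = r * α1 ∧ α2
```

The final assertion holds no matter how the SVD orders or signs the kernel basis, because swapping or flipping $v_1, v_2$ changes $r$ and the wedge by the same sign.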
Suppose $V$ is a vector space of dimension $n$ over some field, and consider a nonzero $\alpha\in\Lambda^{n-1}V$. If $v_1,\dots,v_n$ is a basis of $V$, then there are scalars $\lambda_1,\dots,\lambda_n$ such that $$\alpha=\sum_{i=1}^n\lambda_i\cdot v_1\wedge\dots\wedge\widehat{v_i}\wedge\dots\wedge v_n,$$ where $v_1\wedge\dots\wedge\widehat{v_i}\wedge\dots\wedge v_n=v_1\wedge\dots\wedge v_{i-1}\wedge v_{i+1}\wedge\dots\wedge v_n$. At least one of the coefficients $\lambda_i$ is nonzero. Hence the linear map $$\phi:V\to\Lambda^n V,\quad v\mapsto v\wedge\alpha$$ is nonzero, since if $\lambda_i\neq 0$, then $\phi(v_i)\neq 0$. Its kernel thus is a hyperplane; let $\alpha_1,\dots,\alpha_{n-1}$ be a basis of $\ker(\phi)$.

If $\alpha_n\in V\setminus\ker(\phi)$, then $(\alpha_1,\dots,\alpha_{n-1},\alpha_n)$ is a basis of $V$, and there are coefficients $c_1,\dots,c_n$ such that $$\alpha=\left(\sum_{i=1}^{n-1}c_i\cdot \alpha_1\wedge\dots\wedge \widehat{\alpha_{i}}\wedge\dots\wedge \alpha_{n-1}\wedge \alpha_n\right)+c_n\cdot\alpha_1\wedge\dots\wedge\alpha_{n-1}.$$ Wedging with each $\alpha_j$ for $1\le j\le n-1$, the fact that $\alpha_1,\dots,\alpha_{n-1}\in\ker(\phi)$ implies $c_1=\cdots=c_{n-1}=0$, hence $$\alpha=\alpha_1'\wedge\alpha_2\wedge\dots\wedge\alpha_{n-1},$$ where $\alpha_1'=c_n\alpha_1$.
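The same kernel argument can be checked numerically beyond $n = 3$. Below is a sketch of my own (not part of the answer) for $V = \mathbb R^4$: a 3-form is stored by its components on increasing index triples, the map $\phi(v) = v \wedge \alpha$ becomes a dot product with a coefficient vector, and a basis of $\ker(\phi)$ comes from an SVD:

```python
import itertools
import numpy as np

COMBS = list(itertools.combinations(range(4), 3))   # increasing triples (i, j, k)

def wedge3(a, b, c):
    # components (a ∧ b ∧ c)(e_i, e_j, e_k) for i < j < k: 3x3 determinants
    rows = np.array([a, b, c])
    return np.array([np.linalg.det(rows[:, list(ijk)]) for ijk in COMBS])

e = np.eye(4)
alpha = wedge3(e[0], e[1], e[2]) + wedge3(e[1], e[2], e[3])   # = (e0 + e3) ∧ e1 ∧ e2

# phi(v) = v ∧ alpha as a functional: the coefficient of v ∧ alpha
# on e_0 ∧ e_1 ∧ e_2 ∧ e_3 is Σ_i (-1)^i v_i alpha(ê_i)
c_vec = np.array([(-1) ** i * alpha[COMBS.index(tuple(j for j in range(4) if j != i))]
                  for i in range(4)])

_, _, Vt = np.linalg.svd(c_vec[None, :])   # rows 1..3 of Vt span ker(phi)
w = wedge3(Vt[1], Vt[2], Vt[3])            # wedge of a kernel basis

r = (alpha @ w) / (w @ w)                  # projection coefficient
assert np.allclose(alpha, r * w)           # alpha is a scalar multiple: decomposable
```

The final assertion is exactly the statement being proved: $\alpha$ equals a scalar times the wedge of a basis of $\ker(\phi)$, and absorbing $r$ into one basis vector gives the decomposition.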