The set of $d\vec{x}_I$ with $I$ increasing is a basis of the vector space $\Lambda^k(\mathbb{R}^n)^*$ of alternating multilinear functions


I am trying to prove that the set of $d\vec{x}_I$ with $I$ increasing is a basis of the vector space $\Lambda^k(\mathbb{R}^n)^*$ of alternating multilinear functions, but I am not sure I have done so correctly; in particular, I would like some feedback on the second part of the proof. Thanks.

What I have done:

If $I=(i_1,\dots, i_k)$ is an ordered $k$-tuple, define $d\vec{x}_I:\underbrace{\mathbb{R}^n\times\dots\times\mathbb{R}^n}_{k\text{ times}}\to\mathbb{R}$ by $ d\vec{x}_I(\vec{v_1},\dots,\vec{v_k}) = \begin{vmatrix} dx_{i_1}(\vec{v_1}) & \dots & dx_{i_1}(\vec{v_k})\\ \vdots & \ddots & \vdots\\ dx_{i_k}(\vec{v_1}) & \dots & dx_{i_k}(\vec{v_k}) \end{vmatrix}=\begin{vmatrix} v_{i_1,1} & \dots & v_{i_1,k}\\ \vdots & \ddots & \vdots\\ v_{i_k,1} & \dots & v_{i_k,k} \end{vmatrix},$ where $\vec{v_i}=\begin{bmatrix}v_{1,i}\\v_{2,i}\\\vdots\\v_{n,i}\end{bmatrix}\in\mathbb{R}^n.$
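To make the definition concrete, here is a small numerical sketch (the function `dx` below is an illustrative implementation I wrote for this post, not part of the proof): $d\vec{x}_I$ is just the determinant of the $k\times k$ minor of $[\vec{v_1}\ \dots\ \vec{v_k}]$ given by the rows listed in $I$.

```python
import numpy as np

def dx(I, vs):
    """Evaluate d x_I on k vectors in R^n: the determinant of the k x k
    minor of the n x k matrix [v_1 ... v_k] given by the rows in I (1-based)."""
    M = np.column_stack(vs)                      # n x k, column j is v_j
    return np.linalg.det(M[[i - 1 for i in I], :])

# Example in R^3 with k = 2: d x_{(1,3)} keeps rows 1 and 3.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 5.0, 1.0])
print(dx((1, 3), [v1, v2]))        # det [[1, 0], [2, 1]] = 1
print(dx((1, 3), [v2, v1]))        # swapping the arguments flips the sign
```

Since it is a determinant in the columns $\vec{v_1},\dots,\vec{v_k}$, this form is automatically multilinear and alternating.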

(1) Suppose $\sum\limits_{I\text{ increasing}}c_I\, d\vec{x}_I=0$. Fix an increasing $k$-tuple $I_0=(i^0_1,\dots, i^0_k)$ and consider the tuple $\left(\vec{e_{i^0_1}},\dots,\vec{e_{i^0_k}}\right)$: then $d\vec{x}_{I_0}\left(\vec{e_{i^0_1}},\dots,\vec{e_{i^0_k}}\right)=1$, while $d\vec{x}_I\left(\vec{e_{i^0_1}},\dots,\vec{e_{i^0_k}}\right)=0$ for every increasing $I\neq I_0$. Evaluating the relation at this tuple therefore gives $c_{I_0}\cdot 1=c_{I_0}=0$, and feeding in all the other tuples of basis vectors of $\mathbb{R}^n$ arranged in increasing order yields $c_I=0$ for every increasing $I$, so the $d\vec{x}_I$ are linearly independent.
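The extraction step can be checked numerically (a minimal sketch with made-up coefficients $c_I$; `dx` is my own helper, as above): evaluating $\sum_I c_I\, d\vec{x}_I$ at the basis tuple indexed by $I_0$ returns exactly $c_{I_0}$.

```python
import numpy as np
from itertools import combinations

def dx(I, vs):
    """d x_I as the k x k minor of [v_1 ... v_k] given by rows I (1-based)."""
    M = np.column_stack(vs)
    return np.linalg.det(M[[i - 1 for i in I], :])

n, k = 4, 2
e = np.eye(n)
incr = list(combinations(range(1, n + 1), k))          # increasing 2-tuples
c = dict(zip(incr, [2.0, -1.0, 0.5, 3.0, 0.0, -4.0]))  # arbitrary test coefficients

# Feeding the basis tuple (e_{i1}, e_{i2}) for I0 into sum_I c_I dx_I
# kills every term except the one with I = I0.
for I0 in incr:
    basis_tuple = [e[:, i - 1] for i in I0]
    value = sum(c[I] * dx(I, basis_tuple) for I in incr)
    assert abs(value - c[I0]) < 1e-9
print("each evaluation extracts c_{I0}")
```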

(2) Now let $T\in\Lambda^k(\mathbb{R}^n)^*$ be an alternating multilinear form, $T:\underbrace{\mathbb{R}^n\times\dots\times\mathbb{R}^n}_{k\text{ times}}\to\mathbb{R}$, and write $\vec{v_j}=\sum_{i=1}^{n}a_{i,j}\vec{e_i}$. Then $$\begin{align} T(\vec{v_1},\dots,\vec{v_k})&=T\left(\sum\limits_{i_1=1}^{n}a_{i_1,1}\vec{e_{i_1}},\sum\limits_{i_2=1}^{n}a_{i_2,2}\vec{e_{i_2}},\dots, \sum\limits_{i_k=1}^{n}a_{i_k,k}\vec{e_{i_k}}\right)\\ &\overset{\text{multilinearity}}{=}\sum\limits_{i_1,i_2,\dots,i_k=1}^{n}a_{i_1,1}a_{i_2,2}\dots a_{i_k,k}\,T(\vec{e_{i_1}},\vec{e_{i_2}},\dots,\vec{e_{i_k}}).\end{align}$$ Since $T$ is alternating, every term with a repeated index vanishes, and every tuple of distinct indices arises from a unique increasing tuple $I=(i_1<\dots<i_k)$ via a unique permutation $\sigma\in S_k$, with $T\left(\vec{e_{i_{\sigma(1)}}},\dots,\vec{e_{i_{\sigma(k)}}}\right)=\text{sign}(\sigma)\,T(\vec{e_{i_1}},\dots,\vec{e_{i_k}})$. Grouping the sum accordingly, $$\begin{align} T(\vec{v_1},\dots,\vec{v_k})&=\sum\limits_{1\leq i_1<i_2<\dots<i_k\leq n}\left(\sum\limits_{\sigma\in S_k}\text{sign}(\sigma)\,a_{i_{\sigma(1)},1}\dots a_{i_{\sigma(k)},k}\right)T(\vec{e_{i_1}},\dots,\vec{e_{i_k}})\\&=\sum\limits_{I\text{ increasing}}T(\vec{e_{i_1}},\dots,\vec{e_{i_k}})\,d\vec{x}_I(\vec{v_1},\dots,\vec{v_k}),\end{align}$$ where the last equality is the permutation (Leibniz) expansion of the determinant defining $d\vec{x}_I$, since $dx_{i_r}(\vec{v_c})=a_{i_r,c}$. Hence $T=\sum\limits_{I\text{ increasing}}T_I\, d\vec{x}_I$ with $T_I=T(\vec{e_{i_1}},\dots,\vec{e_{i_k}})$, so the $d\vec{x}_I$ span. $\square$
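The expansion $T=\sum_I T_I\, d\vec{x}_I$ can be tested on a concrete example (a sketch under my own assumptions: I pick the alternating $2$-form $T(u,v)=\det[w\mid u\mid v]$ on $\mathbb{R}^3$ for a fixed $w$, and `dx` is the illustrative helper from above):

```python
import numpy as np
from itertools import combinations

def dx(I, vs):
    """d x_I as the k x k minor of [v_1 ... v_k] given by rows I (1-based)."""
    M = np.column_stack(vs)
    return np.linalg.det(M[[i - 1 for i in I], :])

# An alternating 2-form on R^3: T(u, v) = det[w | u | v] for fixed w.
w = np.array([2.0, -1.0, 3.0])
def T(u, v):
    return np.linalg.det(np.column_stack([w, u, v]))

# Coefficients T_I = T(e_{i1}, e_{i2}) on the increasing 2-tuples.
e = np.eye(3)
incr = list(combinations(range(1, 4), 2))
T_I = {I: T(e[:, I[0] - 1], e[:, I[1] - 1]) for I in incr}

# Check T(u, v) = sum_I T_I dx_I(u, v) on a sample pair of vectors.
u, v = np.array([1.0, 4.0, -2.0]), np.array([0.0, 3.0, 5.0])
lhs = T(u, v)
rhs = sum(T_I[I] * dx(I, [u, v]) for I in incr)
print(lhs, rhs)    # both equal 66
```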

On BEST ANSWER

You can shorten the proof and make the spanning argument a lot cleaner by evaluating only on basis vectors.

I'll denote $\Bbb{R}^{n}$ by $V$ for easier typing.

Let $T\in \Lambda^{k}(V)^{*}$.

Define $T_{I}=T(e_{i_{1}},e_{i_{2}},\dots,e_{i_{k}})$, where $I=(i_{1},\dots,i_{k})$ is any multi-index (not necessarily increasing).

Since $T$ is alternating, $T_{I}=0$ whenever $I$ has a repeated index, and $T_{J}=\text{sgn}(\sigma)T_{I}$ when $J=\sigma(I)$ for some $\sigma\in S_{k}$.

Then for any multi-index $J=(j_1,\dots,j_k)$, $$\sum_{I\,\text{increasing}}T_{I}\,d\vec{x}_{I}(e_{j_{1}},\dots,e_{j_{k}})=\sum_{I\,\text{increasing}}T_{I}\,\delta_{IJ}=T_{J}=T(e_{j_{1}},\dots,e_{j_{k}}),$$

where $\delta_{IJ}=\begin{vmatrix} \delta_{i_{1}j_{1}}&\cdots& \delta_{i_{1}j_{k}}\\\vdots&\ddots&\vdots\\ \delta_{i_{k}j_{1}}&\cdots &\delta_{i_{k}j_{k}}\end{vmatrix}$. (Here $\delta_{ij}$ is the usual Kronecker delta.)
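A quick numerical sanity check of this generalized Kronecker delta (a sketch; `dx` is an illustrative helper, not from the post): $d\vec{x}_{I}(e_{j_1},\dots,e_{j_k})$ is $\pm 1$ when $J$ is a permutation of $I$, with sign equal to $\text{sgn}(\sigma)$, and $0$ otherwise.

```python
import numpy as np

def dx(I, vs):
    """d x_I as the k x k minor of [v_1 ... v_k] given by rows I (1-based)."""
    M = np.column_stack(vs)
    return np.linalg.det(M[[i - 1 for i in I], :])

n = 4
e = [np.eye(n)[:, j] for j in range(n)]    # e[j] is the basis vector e_{j+1}

I = (1, 3)
print(round(dx(I, [e[0], e[2]])))   # J = (1, 3) = I:           1
print(round(dx(I, [e[2], e[0]])))   # J = (3, 1), one swap:    -1
print(round(dx(I, [e[0], e[1]])))   # J = (1, 2), no overlap:   0
```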

Now the above holds for every multi-index $J$, hence for every $k$-tuple of basis vectors of $V$, and since both sides are multilinear it holds on all of $V^{k}$; therefore $\displaystyle T=\sum_{I\,\text{increasing}}T_{I}d\vec{x}_{I}$. This proves that $\{d\vec{x}_{I}:I\,\text{increasing}\}$ spans $\Lambda^{k}(V)^*$.

Your proof of linear independence is correct!

Note: this also lets you conclude that $\dim\left(\Lambda^{k}(V)^{*}\right)=\binom{n}{k}$. For a more detailed proof, see Lee's Smooth Manifolds, in the chapter on differential forms.
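The dimension count is just the number of increasing $k$-tuples drawn from $\{1,\dots,n\}$, which is exactly what `itertools.combinations` enumerates (a one-line check):

```python
from itertools import combinations
from math import comb

n, k = 5, 3
# Increasing k-tuples from {1, ..., n} are precisely the k-combinations.
increasing = list(combinations(range(1, n + 1), k))
print(len(increasing), comb(n, k))   # 10 10
```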