Proving that $\dim(\bigwedge^k(V^*)) = \binom{n}{k}$ without constructing an explicit basis


Text:

[The quoted passage from the book was posted as two images and is not reproduced here.]

Discussion:

I find this argument kind of hard to follow because an explicit basis is never constructed; the argument seems kind of indirect.

I'm relatively comfortable with the first sentence in the paragraph.* But I don't really understand the purpose of the second sentence, or the "big picture" of how the argument unfolds here. For example, why does the next sentence begin with "On the other hand"? Is there an "if and only if" proof going on here?

Paraphrasing, the argument seems to be "If you know the values of the $k$-form on all $k$-subsets of basis vectors (in ascending order), then you have total knowledge of the $k$-form. Conversely, any $k$-form is an arbitrary prescription of values on those ordered $k$-tuples, and then every other value can be found by using multilinearity and antisymmetry. Therefore... ??? ...the space has dimension $\binom{n}{k}$."

*I have added as an appendix to this question my argument that each exterior $k$-form is uniquely determined by its values on all strictly increasing $k$-subsets of basis vectors of $V$.

Appendix:

Fix a basis $e_1, \dots, e_n$ of $V$. If we know what $\omega^k \in \bigwedge^k(V^*)$ does to every $k$-subset of these $n$ basis vectors, then we have total knowledge of $\omega^k$.

Write

\begin{align*} \omega^k(v_1, \dots, v_k) &= \omega^k \Big( \sum_{i = 1}^n \alpha_i^1 e_i, \dots, \sum_{i = 1}^n \alpha_i^k e_i\Big), \end{align*}

where

\begin{align*} &\alpha_1^1, \dots, \alpha_n^1\\ &\vdots\\ &\alpha_1^k, \dots, \alpha_n^k \end{align*}

are respectively the coordinates of $v_1, \dots, v_k$ with respect to the chosen basis.

By multilinearity, this becomes

\begin{align*} \sum_{i_1 = 1}^n \cdots \sum_{i_k = 1}^n \alpha_{i_1}^1 \cdots \alpha_{i_k}^k \omega^k(e_{i_1}, \dots, e_{i_k}). \end{align*}

There are $n^k$ terms in the above sum. The alternating property implies that any term containing fewer than $k$ distinct basis vectors is zero, which reduces the number of possibly nonzero terms to $(n)_k$ (the falling factorial).
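As a quick sanity check of these counts (a hypothetical Python snippet, not from the post), the tuples with a repeated index really do cut $n^k$ down to the falling factorial $(n)_k$:

```python
from itertools import product
from math import perm  # perm(n, k) is the falling factorial (n)_k

n, k = 5, 3
# All n**k index tuples appearing in the multilinear expansion
tuples = list(product(range(n), repeat=k))
# Keep only tuples with k distinct basis vectors; the rest vanish
distinct = [t for t in tuples if len(set(t)) == k]
assert len(tuples) == n**k            # 125
assert len(distinct) == perm(n, k)    # (5)_3 = 5*4*3 = 60
```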

Any $k$-subset (with $k$ distinct basis vectors) occurs in $k!$ orderings, and the alternating property makes these orderings redundant: if $(i_1, \dots, i_k)$ is a reordering of $j_1 < \dots < j_k$, then

\begin{align*} \omega^k(e_{i_1}, \dots, e_{i_k}) &= \operatorname{sgn}(\sigma)\, \omega^k(e_{j_1}, \dots, e_{j_k}), \end{align*}

where $\sigma$ is the permutation carrying $(j_1, \dots, j_k)$ to $(i_1, \dots, i_k)$. We can therefore consider each $k$-subset only once, and there are

\begin{align*} \frac{(n)_k}{k!} &= \binom{n}{k} \end{align*}

$k$-subsets of $n$. Therefore, $\omega^k$ is uniquely determined by the $\binom{n}{k}$ values it takes on all ordered $k$-subsets of basis vectors of $V$.
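The appendix argument can be checked numerically. The following hypothetical Python sketch (names and the brute-force setup are my own, not from the book) prescribes arbitrary values on the $\binom{n}{k}$ increasing basis tuples, extends by multilinearity and antisymmetry, and confirms that the full $n^k$-term expansion agrees with the reduced sum of $k \times k$ minors over increasing subsets:

```python
from itertools import combinations, permutations, product
import random

def sort_sign(t):
    # Sign of the permutation sorting the index tuple t: (-1)^(number of inversions)
    return (-1) ** sum(t[i] > t[j] for i in range(len(t)) for j in range(i + 1, len(t)))

n, k = 4, 2
random.seed(0)

# Prescribe arbitrary values on the C(n, k) strictly increasing basis tuples
c = {S: random.uniform(-1, 1) for S in combinations(range(n), k)}

def omega_on_basis(t):
    # Value on (e_{t_1}, ..., e_{t_k}) forced by the alternating property
    if len(set(t)) < len(t):
        return 0.0  # repeated basis vector: term vanishes
    return sort_sign(t) * c[tuple(sorted(t))]

def omega_full(vectors):
    # Brute-force multilinear expansion over all n**k index tuples
    total = 0.0
    for t in product(range(n), repeat=k):
        coeff = 1.0
        for m, i in enumerate(t):
            coeff *= vectors[m][i]
        total += coeff * omega_on_basis(t)
    return total

def omega_minors(vectors):
    # Reduced formula: one k-by-k minor determinant per increasing subset S
    total = 0.0
    for S, cS in c.items():
        det = 0.0
        for p in permutations(range(k)):
            term = sort_sign(p)
            for m in range(k):
                term *= vectors[m][S[p[m]]]
            det += term
        total += det * cS
    return total

vecs = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(k)]
assert abs(omega_full(vecs) - omega_minors(vecs)) < 1e-9
```

The agreement of the two computations is exactly the claim that the $\binom{n}{k}$ prescribed values determine $\omega^k$ entirely.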

There are 3 answers below.

Answer 1:

For $S=\{s_1,\dots,s_k\}\subseteq\{1,2,\dots,n\}$ with $s_1<s_2<\dots<s_k$, let $E_S$ be the unique alternating multilinear map that maps $(e_{s_1},\dots,e_{s_k})$ to $1$ and $(e_{i_1},\dots,e_{i_k})$ to $0$ whenever $\{i_1,\dots,i_k\}\ne S$.

The implicit claim is that $\{E_S:\,S\subseteq \{1,2,\dots,n\},\ |S|=k\}$ is a basis for $\Lambda^k(V^*)$.
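Concretely (a hypothetical sketch of my own, not from the answer), $E_S$ can be realized as the $k \times k$ minor determinant picking out the coordinates indexed by $S$, and its defining property on basis tuples can be verified directly:

```python
from itertools import combinations, permutations

def E(S):
    """Alternating k-form: determinant of the coordinates indexed by S."""
    k = len(S)
    def form(vectors):
        det = 0
        for p in permutations(range(k)):
            sgn = (-1) ** sum(p[a] > p[b] for a in range(k) for b in range(a + 1, k))
            term = sgn
            for m in range(k):
                term *= vectors[m][S[p[m]]]
            det += term
        return det
    return form

n, k = 4, 2
basis = [[1 if j == i else 0 for j in range(n)] for i in range(n)]
for S in combinations(range(n), k):
    ES = E(S)
    # E_S sends (e_{s_1}, ..., e_{s_k}) to 1 ...
    assert ES([basis[i] for i in S]) == 1
    # ... and kills any increasing basis tuple with a different index set
    for T in combinations(range(n), k):
        if T != S:
            assert ES([basis[i] for i in T]) == 0
```

Evaluating the $E_S$ on the increasing basis tuples gives the identity matrix, which is why this family is linearly independent and spans.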

Answer 2:

I appreciate Berci's answer although I'm not sure it precisely answers the question. My best attempt to translate what the book is saying is as follows:

If I know the values of the map on all $\binom{n}{k}$ strictly increasing $k$-tuples of basis vectors of $V$, then I know the whole map. If I knew fewer values than this, say $\binom{n}{k} - 1$, then I clearly wouldn't have enough information to construct the map: most obviously, there would be some increasing tuple $(e_{i_1}, \dots, e_{i_k})$ for which I don't know $\omega^k(e_{i_1}, \dots, e_{i_k})$. Therefore, the dimension of this space can't be less than $\binom{n}{k}$, so it must be exactly $\binom{n}{k}$.

It's a little fuzzy because the argument discusses subsets of the basis of $V$ rather than maps in $\bigwedge^k(V^*)$, but that's the best I've got.

Later on the book constructs an explicit basis, so hopefully that will help things make more sense.

Answer 3:

(I think the dual-space aspect is making things less clear here...)

My own picture of how to prove that the dimension of the $k$th exterior power of an $n$-dimensional vector space $V$ is of dimension ${n\choose k}$ is as follows.

First, as discussed in the natural/obvious way in the question, the permutation argument with a choice of basis shows that the dimension is at most ${n\choose k}$. Yes, it is eminently plausible that this is exactly the dimension, and it is, but a little something is appropriate (if not "necessary"? :)) to really prove it.

My own favorite way to understand this is to pair $\bigwedge^k V$ and $\bigwedge^{n-k}V$ to $\bigwedge^n V$ by $(x, y)\mapsto x\wedge y$. (We need not choose a basis to describe this map.) We already know that the highest (purportedly non-vanishing) exterior power of $V$ is at most one-dimensional. Thus, the whole issue reduces to proving that that highest exterior power is exactly one-dimensional, and, then, that this pairing is non-degenerate.
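On basis monomials the pairing is easy to compute. In this hypothetical Python sketch (my own illustration, assuming the spanning set of increasing monomials from the permutation argument), $e_S \wedge e_T$ is $\pm e_{\{1,\dots,n\}}$ when $T$ is the complement of $S$ and $0$ otherwise, so the pairing matrix is a signed permutation matrix and in particular non-degenerate:

```python
from itertools import combinations

def wedge_sign(S, T):
    """Coefficient of e_{S union T} in e_S ^ e_T for sorted index tuples S, T.

    Returns 0 if S and T overlap; otherwise (-1)^(crossings), the sign of the
    shuffle merging the concatenation (S, T) into sorted order.
    """
    if set(S) & set(T):
        return 0
    crossings = sum(1 for s in S for t in T if s > t)
    return (-1) ** crossings

n, k = 5, 2
subsets = list(combinations(range(n), k))
cosubsets = list(combinations(range(n), n - k))

# Pairing matrix: entry (S, T) is the coefficient of e_{0..n-1} in e_S ^ e_T
M = [[wedge_sign(S, T) for T in cosubsets] for S in subsets]

# Each row has exactly one nonzero entry (T must be the complement of S),
# so M is a signed permutation matrix: the pairing is non-degenerate.
for row in M:
    assert sum(1 for x in row if x != 0) == 1
```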

The one-dimensionality can be done by constructing a determinant... There are several ways to do this.

Then, the non-degeneracy of the pairing is easy.

But, yes, something has to be done to truly verify these quite-plausible assertions.