Text:
Discussion:
I find this argument kind of hard to follow because an explicit basis is never constructed; the argument seems kind of indirect.
I'm relatively comfortable with the first sentence in the paragraph.* But I don't really understand the purpose of the second sentence, or the "big picture" of how the argument is unfolding here. For example, why does the next sentence begin with "On the other hand"? Is there an "if and only if" proof going on here?
Paraphrasing, the argument seems to be "If you know the values of the $k$-form on all $k$-subsets of basis vectors (in ascending order), then you have total knowledge of the $k$-form. Conversely, any $k$-form is an arbitrary prescription of values on those ordered $k$-tuples, and then every other value can be found by using multilinearity and antisymmetry. Therefore... ??? ...the space has dimension $\binom{n}{k}$."
*I have added as an appendix to this question my argument that each exterior $k$-form is uniquely determined by its values on all strictly increasing $k$-subsets of basis vectors of $V$.
Appendix:
Fix a basis $e_1, \dots, e_n$ of $V$. If we know what $\omega^k \in \bigwedge^k(V^*)$ does to every strictly increasing $k$-tuple of these $n$ basis vectors, then we have total knowledge of $\omega^k$.
Write
\begin{align*} \omega^k(v_1, \dots, v_k) &= \omega^k \Big( \sum_{i = 1}^n \alpha_i^1 e_i, \dots, \sum_{i = 1}^n \alpha_i^k e_i\Big), \end{align*}
where
\begin{align*} &\alpha_1^1, \dots, \alpha_n^1\\ &\vdots\\ &\alpha_1^k, \dots, \alpha_n^k \end{align*}
are respectively the coordinates of $v_1, \dots, v_k$ with respect to the chosen basis.
By multilinearity, this becomes
\begin{align*} \sum_{i_1 = 1}^n \cdots \sum_{i_k = 1}^n \alpha_{i_1}^1 \cdots \alpha_{i_k}^k \omega^k(e_{i_1}, \dots, e_{i_k}). \end{align*}
There are $n^k$ terms in the above sum. The alternating property implies that any term containing fewer than $k$ distinct basis vectors is zero, which reduces the number of potentially nonzero terms to $(n)_k$ (the falling factorial).
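As a concrete sanity check (my own, not part of the argument), the expansion can be verified numerically for a 2-form on $\mathbb{R}^3$ represented by an antisymmetric matrix:

```python
# Numerical sanity check for n = 3, k = 2: expand an alternating bilinear
# form over basis vectors via multilinearity, and confirm that terms with
# a repeated basis vector vanish by the alternating property.
import numpy as np

n = 3
A = np.array([[0., 2., -1.],
              [-2., 0., 3.],
              [1., -3., 0.]])  # antisymmetric matrix: omega(u, v) = u @ A @ v

def omega(u, v):
    return u @ A @ v

e = np.eye(n)
v1 = np.array([1., 4., -2.])
v2 = np.array([0., 5., 7.])

# Full n^k-term expansion from multilinearity
expansion = sum(v1[i] * v2[j] * omega(e[i], e[j])
                for i in range(n) for j in range(n))
assert np.isclose(expansion, omega(v1, v2))

# Terms with a repeated basis vector (here, the diagonal ones) are zero
assert all(np.isclose(omega(e[i], e[i]), 0.) for i in range(n))
```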
Each $k$-subset of basis vectors occurs in $k!$ orderings among the remaining terms, and the alternating property makes all but one of these redundant: for any permutation $\sigma$ of $\{1, \dots, k\}$,
\begin{align*} \omega^k(e_{j_{\sigma(1)}}, \dots, e_{j_{\sigma(k)}}) &= \operatorname{sgn}(\sigma) \, \omega^k(e_{j_1}, \dots, e_{j_k}), \end{align*}
where $j_1 < \dots < j_k$ (i.e. the basis vectors put in ascending order). We can therefore consider each $k$-subset only once, and there are
\begin{align*} \frac{(n)_k}{k!} &= \binom{n}{k} \end{align*}
$k$-subsets of $n$. Therefore, $\omega^k$ is uniquely determined by the $\binom{n}{k}$ values it takes on the strictly increasing $k$-tuples of basis vectors of $V$.
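A small numerical sketch of both the sign redundancy and the count (my own check, using the determinant as a concrete alternating 3-form on $\mathbb{R}^3$):

```python
# The determinant of the matrix with columns (u, v, w) is an alternating
# 3-form on R^3. Permuting the arguments multiplies the value by sgn(sigma),
# so only the ascending ordering needs to be stored, and the number of
# ascending k-tuples from n indices is (n)_k / k! = C(n, k).
from itertools import combinations, permutations
from math import comb, factorial, perm
import numpy as np

def det_form(u, v, w):
    return np.linalg.det(np.column_stack([u, v, w]))

def parity(sigma):
    # Sign of a permutation given as a tuple, computed by counting swaps
    sign, p = 1, list(sigma)
    for i in range(len(p)):
        while p[i] != i:
            j = p[i]
            p[i], p[j] = p[j], p[i]
            sign = -sign
    return sign

e = np.eye(3)
base = det_form(e[0], e[1], e[2])  # value on the ascending tuple
for sigma in permutations(range(3)):
    permuted = det_form(e[sigma[0]], e[sigma[1]], e[sigma[2]])
    assert np.isclose(permuted, parity(sigma) * base)

# Counting: ascending k-tuples number (n)_k / k! = C(n, k)
n, k = 5, 3
assert len(list(combinations(range(n), k))) == comb(n, k) == perm(n, k) // factorial(k)
```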


For $S=\{s_1,\dots,s_k\}\subseteq\{1,2,\dots,n\}$ with $s_1<s_2<\dots<s_k$, let $E_S$ be the unique alternating multilinear map that maps $(e_{s_1},\dots,e_{s_k})$ to $1$ and $(e_{i_1},\dots,e_{i_k})$ to $0$ whenever $\{i_1,\dots,i_k\}\ne S$.
The implicit claim is that $\{E_S:\,S\subseteq \{1,2,\dots,n\},\ |S|=k\}$ is a basis for $\bigwedge^k(V^*)$.
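A concrete instance of the spanning part of this claim (my own sketch, representing 2-forms on $\mathbb{R}^3$ by antisymmetric matrices): any alternating 2-form is the sum, over ascending pairs $S = \{i, j\}$, of its value on $(e_i, e_j)$ times $E_S$.

```python
# Hypothetical representation: for n = 3, k = 2, identify each alternating
# 2-form with its antisymmetric matrix. E_S for S = {i, j} with i < j then
# corresponds to the matrix with +1 at (i, j) and -1 at (j, i).
from itertools import combinations
import numpy as np

n = 3

def E(i, j):
    M = np.zeros((n, n))
    M[i, j], M[j, i] = 1., -1.
    return M

A = np.array([[0., 2., -1.],
              [-2., 0., 3.],
              [1., -3., 0.]])  # an arbitrary alternating 2-form

# Expand A in the E_S "basis" using its values on ascending pairs (i < j)
recon = sum(A[i, j] * E(i, j) for i, j in combinations(range(n), 2))
assert np.allclose(recon, A)
```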