Let $V$ be a vector space over $\mathbb{R}$ and let ${\Lambda}_k(V)$ be the set of all alternating multilinear functions from $\prod_{i = 1}^k V$ to $\mathbb{R}$. Suppose that $k \leq n = \dim V$, and let $\{v_1 , \ldots , v_n\}$ be a basis for $V$ with dual basis $\{v_1^* , \ldots , v_n^*\}$ for $V^*$. Fix $f \in {\Lambda}_k(V)$; I want to show that $$ f = \sum_{1\leq i_1 < \ldots < i_k\leq n} {\lambda}_{i_1 , \ldots , i_k} \left(\bigwedge_{j = 1}^k v_{i_j}^*\right)\mbox{,} $$ where ${\lambda}_{i_1 , \ldots , i_k} = f(v_{i_1} , \ldots , v_{i_k})$, which implies that ${\Lambda}_k(V)$ is spanned by $$ \mathcal{B} = {\left\{\bigwedge_{j = 1}^k v_{i_j}^*\right\}}_{1\leq i_1 < \ldots < i_k\leq n}\mbox{.} $$
At http://www.maths.adelaide.edu.au/michael.murray/dg_hons/node25.html it is suggested to expand the right-hand side of the last equation, and what follows is my attempt. Fix $u_j \in V$, $j = 1 , \ldots , k$. I am trying to prove that $$ \sum_{1\leq i_1 < \ldots < i_k\leq n} {\lambda}_{i_1 , \ldots , i_k} \left(\bigwedge_{j = 1}^k v_{i_j}^*\right)(u_1 , \ldots , u_k) = f(u_1 , \ldots , u_k)\mbox{.} $$ First we need to express each $u_j$ in terms of the vectors $v_i$, $i = 1 , \ldots , n$; say $$ u_j = \sum_{i = 1}^n {\lambda}_i^j v_i \quad \mbox{ for all } \quad j = 1 , \ldots , k\mbox{.} $$ Therefore, $$ \sum_{{i_1 , \ldots , i_k \in \{1 , \ldots , n\}} \atop {i_1 < \ldots < i_k}} f(v_{i_1} , \ldots , v_{i_k}) \left(\bigwedge_{l = 1}^k v_{i_l}^*\right)(u_1 , \ldots , u_k) = $$ $$ = \sum_{{i_1 , \ldots , i_k \in \{1 , \ldots , n\}} \atop {i_1 < \ldots < i_k}} f(v_{i_1} , \ldots , v_{i_k}) \left(\sum_{j_1 , \ldots , j_k \in \{1 , \ldots , n\}} \left(\prod_{l = 1}^k {\lambda}_{j_l}^l\right) \left(\bigwedge_{l = 1}^k v_{i_l}^*\right)(v_{j_1} , \ldots , v_{j_k})\right) = $$ $$ = \sum_{{i_1 , \ldots , i_k \in \{1 , \ldots , n\}} \atop {i_1 < \ldots < i_k}} f(v_{i_1} , \ldots , v_{i_k}) \left(\sum_{j_1 , \ldots , j_k \in \{1 , \ldots , n\}} \left(\prod_{l = 1}^k {\lambda}_{j_l}^l\right) \left(\sum_{\sigma \in G_{j_1 , \ldots , j_k}} \operatorname{sgn}(\sigma) \left(\prod_{l = 1}^k {\delta}_{i_l , \sigma(j_l)}\right)\right)\right) = $$ $$ = \sum_{{i_1 , \ldots , i_k \in \{1 , \ldots , n\}} \atop {i_1 < \ldots < i_k}} f(v_{i_1} , \ldots , v_{i_k}) \left(\sum_{j_1 , \ldots , j_k \in \{1 , \ldots , n\} \atop \sigma \in S_k} \operatorname{sgn}(\sigma)\left(\prod_{l = 1}^k {\lambda}_{j_l}^l {\delta}_{i_l , j_{\sigma(l)}}\right)\right) $$ $$ = \sum_{{i_1 , \ldots , i_k \in \{1 , \ldots , n\}} \atop {i_1 < \ldots < i_k}} \left(\sum_{{j_1 , \ldots , j_k \in \{1 , \ldots , n\} \atop \sigma \in S_k} \atop {j_{\sigma(l)} = i_l}} \operatorname{sgn}(\sigma) f(v_{i_1} , \ldots , v_{i_k}) \left(\prod_{l = 1}^k {\lambda}_{j_l}^l\right)\right) = $$ $$ = \binom{n}{k} \sum_{{j_1 , \ldots , j_k \in \{1 , \ldots , n\} \atop \sigma \in S_k} \atop {j_{\sigma(1)} < \ldots < j_{\sigma(k)}}} f(v_{j_1} , \ldots , v_{j_k})\left(\prod_{l = 1}^k {\lambda}_{j_l}^l\right) = $$ $$ = \binom{n}{k} \sum_{{j_1 , \ldots , j_k \in \{1 , \ldots , n\} \atop \sigma \in S_k} \atop {j_{\sigma(1)} < \ldots < j_{\sigma(k)}}} f\left(\sum_{i = 1}^n {\lambda}_i^1 v_i , \ldots , \sum_{i = 1}^n {\lambda}_i^k v_i\right) = \binom{n}{k} f(u_1 , \ldots , u_k)\mbox{,} $$ where $G_{j_1 , \ldots , j_k} \cong S_k$ via the map $\psi : S_k \to G_{j_1 , \ldots , j_k}$ induced by $l \mapsto j_l$ ($G_{j_1 , \ldots , j_k}$ is essentially $S_k$ acting on the indices $j_1 , \ldots , j_k$). I suspect that several steps are wrong, the last one in particular. Can you help me finish the proof and correct my mistakes? Thank you very much.
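Before chasing indices, it can help to check the claimed identity numerically for a concrete alternating $f$. Below is a small sketch in Python (my own illustration, not from the linked notes), assuming the convention $\left(\bigwedge_{l = 1}^k v_{i_l}^*\right)(u_1 , \ldots , u_k) = \det\left(v_{i_l}^*(u_m)\right)_{l,m}$ for the wedge of dual basis covectors; note the expansion closes without any $\binom{n}{k}$ factor:

```python
import itertools
import math
import random

random.seed(1)
n, k = 4, 3

def perm_sign(perm):
    """Sign of a permutation given in one-line notation, via inversion count."""
    sign = 1
    for a in range(len(perm)):
        for b in range(a + 1, len(perm)):
            if perm[a] > perm[b]:
                sign = -sign
    return sign

# Build an alternating k-linear f by antisymmetrizing a random k-tensor T.
T = {idx: random.gauss(0.0, 1.0)
     for idx in itertools.product(range(n), repeat=k)}

def f(*us):
    total = 0.0
    for idx in itertools.product(range(n), repeat=k):
        for perm in itertools.permutations(range(k)):
            prod = T[idx] * perm_sign(perm)
            for a in range(k):
                prod *= us[perm[a]][idx[a]]
            total += prod / math.factorial(k)
    return total

def wedge_of_duals(s, us):
    """(v_{s_1}^* ^ ... ^ v_{s_k}^*)(u_1,...,u_k) = det of the matrix u_m[s_l]."""
    det = 0.0
    for perm in itertools.permutations(range(k)):
        prod = perm_sign(perm)
        for l in range(k):
            prod *= us[perm[l]][s[l]]
        det += prod
    return det

# Standard basis vectors and random test vectors u_1, ..., u_k.
e = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
us = [[random.gauss(0.0, 1.0) for _ in range(n)] for _ in range(k)]

lhs = f(*us)
rhs = sum(f(*[e[i] for i in s]) * wedge_of_duals(s, us)
          for s in itertools.combinations(range(n), k))
assert abs(lhs - rhs) < 1e-9
```

Here $f$ is alternating by construction, and the final assertion checks $f(u_1 , \ldots , u_k)$ against the expansion over increasing index tuples with coefficients ${\lambda}_{i_1 , \ldots , i_k} = f(v_{i_1} , \ldots , v_{i_k})$.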
Let $_n\mathbf{P}_k$ be the set of all ordered $k$-tuples of distinct elements of $\{1, \ldots, n\}$, and let $_n\mathbf{C}_k$ be the set of all such tuples that are increasing.
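Concretely (the names `nPk` and `nCk` are my own), these two index sets correspond to `itertools.permutations` and `itertools.combinations` in Python:

```python
import itertools

n, k = 3, 2
nPk = list(itertools.permutations(range(1, n + 1), k))  # ordered k-tuples, distinct entries
nCk = list(itertools.combinations(range(1, n + 1), k))  # increasing k-tuples
# nCk == [(1, 2), (1, 3), (2, 3)]; nPk additionally contains (2, 1), (3, 1), (3, 2)
```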
Now, phrased in more succinct terms, we want to show
$$f = \underbrace{\sum_{s \,\in\, _n\mathbf{C}_k} \lambda_s \left(\bigwedge_{i \in s} \mathbf{e}^*_i \right)}_{\mathrm{RHS}}$$
where
$$\lambda_s = f\left(\prod_{i \in s} \mathbf{e}_i\right)$$
Now $f$ is multilinear, and the $\mathrm{RHS}$ is a sum of multilinear functions, so both sides are multilinear. Thus, it suffices to show that both sides agree on each tuple of basis vectors. In fact, we only need the tuples with distinct entries: on a tuple containing a repeated basis vector, both sides vanish, since both are alternating. That is, it suffices to show that for all $p \in \, _n\mathbf{P}_k$:
$$f\left(\prod_{i \in p} \mathbf{e}_i\right) = \sum_{s \,\in\, _n\mathbf{C}_k} \lambda_s \left(\bigwedge_{i \in s} \mathbf{e}^*_i \right)\left(\prod_{i \in p} \mathbf{e}_i\right)$$
There is exactly one $s_0 \in\, _n\mathbf{C}_k$ for which a permutation $\sigma$ with $\sigma(p) = s_0$ exists: namely, $s_0$ is $p$ sorted into increasing order. Then, since the wedge product of the dual vectors is alternating,
$$\left(\bigwedge_{i \in s_0} \mathbf{e}^*_i \right)\left(\prod_{i \in p} \mathbf{e}_i\right) = \operatorname{sgn}(\sigma) \left(\bigwedge_{i \in s_0} \mathbf{e}^*_i \right)\left(\prod_{i \in s_0} \mathbf{e}_i\right)$$ and $$f\left(\prod_{i \in p} \mathbf{e}_i\right) = \operatorname{sgn}(\sigma)f\left(\prod_{i \in s_0} \mathbf{e}_i\right)$$
Otherwise, if $s \neq s_0$, then by definition (with your definition, it follows by induction),
$$\left(\bigwedge_{i \in s} \mathbf{e}^*_i \right)\left(\prod_{i \in p} \mathbf{e}_i\right) = 0$$
Furthermore, by definition (again by induction from your definition),
$$\left(\bigwedge_{i \in s} \mathbf{e}^*_i \right)\left(\prod_{i \in s} \mathbf{e}_i\right) = 1$$
Putting this all together gives the desired result.
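The case analysis above, $\operatorname{sgn}(\sigma)$ for the unique $s_0$ and $0$ for every other $s$, can be verified exhaustively for small $n$ and $k$. A sketch in Python (my own check, again assuming the determinant convention $\left(\bigwedge_{i \in s} \mathbf{e}^*_i\right)(\mathbf{e}_{p_1}, \ldots, \mathbf{e}_{p_k}) = \det\left(\delta_{s_l, p_m}\right)_{l,m}$):

```python
import itertools

n, k = 4, 2

def perm_sign(perm):
    """Sign of a permutation in one-line notation, via inversion count."""
    sign = 1
    for a in range(len(perm)):
        for b in range(a + 1, len(perm)):
            if perm[a] > perm[b]:
                sign = -sign
    return sign

def wedge_on_basis(s, p):
    """(e_{s_1}^* ^ ... ^ e_{s_k}^*)(e_{p_1}, ..., e_{p_k}) as det(delta_{s_l, p_m})."""
    det = 0
    for perm in itertools.permutations(range(k)):
        prod = perm_sign(perm)
        for l in range(k):
            prod *= 1 if s[l] == p[perm[l]] else 0
        det += prod
    return det

for p in itertools.permutations(range(n), k):        # p in nPk
    s0 = tuple(sorted(p))
    # sign of the permutation sorting p into increasing order
    sigma_sign = perm_sign(tuple(sorted(range(k), key=lambda l: p[l])))
    for s in itertools.combinations(range(n), k):    # s in nCk
        expected = sigma_sign if s == s0 else 0
        assert wedge_on_basis(s, p) == expected
```

The loop confirms that on basis tuples each wedge of dual covectors acts as a signed Kronecker delta, which is exactly what the argument combines.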