Let $E$ be a vector space. Then, for any pair of vectors $e_1, e_2\in E$ the following holds:
$$ e_1\wedge e_2 = - e_2\wedge e_1$$
where $\wedge$ denotes the exterior product. In principle, this identity already allows many calculations: swapping two adjacent vectors changes the sign, and any term containing a repeated vector vanishes.
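For instance, antisymmetry immediately gives both of these consequences (the second follows by setting $e_1 = e_2$ in the identity above):
$$ e_2\wedge e_1 = -\,e_1\wedge e_2, \qquad e_1\wedge e_1 = -\,e_1\wedge e_1 \;\implies\; e_1\wedge e_1 = 0. $$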
However, the tensor product is defined for tensors of any rank, each of which can be expressed in terms of a basis of the appropriate space, i.e. as a linear combination of tensor products of basis vectors. Thus one may wish to compute or simplify expressions like:
$$ (e_1\wedge e_2 - 2e_1 \wedge e_1) \otimes e_2 $$
Since both products are bilinear, such expressions simplify considerably. How does one simplify expressions of the form $$(a \wedge b) \otimes c$$ where $a$, $b$, and $c$ are vectors expressed in terms of a basis of the vector space? More generally, how does one simplify mixed products involving both tensor and exterior products?
If $V$ and $W$ are both real (or complex) vector spaces where $V$ has basis $\{v_i\}$ and $W$ has basis $\{w_j\}$, then $V \otimes W$ has basis $\{v_i \otimes w_j\}$. There is no interaction between elements of $V$ and elements of $W$ in $V \otimes W$.
Suppose $V = \Lambda^2(E)$ where $E$ has basis $\{e_i\}$. Then $V$ has basis $\{e_i \wedge e_j\}$ for $i < j$, so when $W$ has basis $\{w_k\}$, $\Lambda^2(E) \otimes W$ has basis $\{(e_i \wedge e_j) \otimes w_k\}$. You can let $W = E$; it has no effect. What more do you want to know?
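To illustrate with the expression from the question: bilinearity of $\otimes$ lets you distribute over the sum, and antisymmetry of $\wedge$ kills the repeated term, so
$$ (e_1\wedge e_2 - 2\,e_1\wedge e_1)\otimes e_2 = (e_1\wedge e_2)\otimes e_2 - 2\,(e_1\wedge e_1)\otimes e_2 = (e_1\wedge e_2)\otimes e_2, $$
since $e_1\wedge e_1 = 0$. The result is already a basis element of $\Lambda^2(E)\otimes E$, so no further simplification is possible.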