I understand how $(0, 1)$ and $(1, 0)$ tensors are linear functionals and vectors, respectively. But I can't see how a $(1, 1)$ tensor can represent a linear transformation, or how a $(1, 2)$ tensor can represent the cross product. I have shown below how I 'derived' that linear functionals and vectors are $(0, 1)$ and $(1, 0)$ tensors. How can I do the same for linear transformations and the cross product?
(0, 1) Tensor
$$\begin{array}{rll} T: & V & \longrightarrow \mathbb{R} \\ & v & \longmapsto \psi(v) \end{array}$$
So $\psi \in V^\ast$, that is, $\psi$ is a linear functional.
(1, 0) Tensor
$$\begin{array}{rll} T: & V^\ast & \longrightarrow \mathbb{R} \\ & \psi & \longmapsto v(\psi) = \psi(v) \end{array}$$
So $v \in V$, that is, $v$ is a vector. (Strictly, this uses the canonical embedding of $V$ into its double dual $V^{\ast\ast}$, which is an isomorphism when $V$ is finite-dimensional.)
(1, 1) Tensor
$$\begin{array}{rll} T: & V^\ast \times V & \longrightarrow \mathbb{R} \\ & (\psi, v) & \longmapsto ??? \end{array}$$
where $???$ should somehow encode a linear transformation.
(1, 2) Tensor
$$\begin{array}{rll} T: & V^\ast \times V \times V & \longrightarrow \mathbb{R} \\ & (\psi, v, w) & \longmapsto ??? \end{array}$$
where $???$ should somehow encode the cross product.
For the $(1,1)$ tensor: if $\Phi:V \to V$ is a linear transformation, then we can take $$ T: (\psi,v) \mapsto \psi(\Phi(v)). $$ When $V$ is finite-dimensional, the correspondence $\Phi \mapsto T$ is in fact a bijection, so $(1,1)$ tensors and linear transformations carry exactly the same data.

I would not say that every $(1,2)$ tensor is a cross product, but the cross product (on $V = \mathbb{R}^3$) is an example of such a tensor. In particular, we have the map $$ T: (\psi,v,w) \mapsto \psi(v \times w), $$ which is linear in each argument and so satisfies the requirements of a $(1,2)$ tensor.
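In components, both constructions look as follows (a sketch, assuming a finite-dimensional $V$ with basis $\{e_i\}$ and dual basis $\{e^i\}$, and, for the cross product, $V = \mathbb{R}^3$ with the Levi-Civita symbol $\varepsilon_{ijk}$):
$$ T(\psi, v) = \sum_{i,j} \psi_i \, {\Phi^i}_j \, v^j, \qquad T(\psi, v, w) = \sum_{i,j,k} \varepsilon_{ijk} \, \psi_i \, v^j w^k, $$
where $\psi = \sum_i \psi_i e^i$, $v = \sum_j v^j e_j$, $w = \sum_k w^k e_k$, and $\Phi(e_j) = \sum_i {\Phi^i}_j e_i$. The matrix entries ${\Phi^i}_j$ of the linear transformation are precisely the components of the $(1,1)$ tensor, and $\varepsilon_{ijk}$ gives the components of the cross-product tensor.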