In my attempt to gain some sort of understanding of tensor products (of vector spaces), and looking at the corresponding nLab entry (or Wikipedia's), I tried to consider a very basic example. Unfortunately, even there, I'm stuck.
Let $V=W=U=\mathbb{R}^2$ (with the standard bases).
Let $B : V \times W \to U$ be bilinear.
Let $B_\otimes : V\otimes W \to U$ be linear and such that $B_\otimes(v\otimes w) = B(v,w)$ for all $v\in V$ and $w\in W$.
\begin{align*} v=\begin{bmatrix} v_1 \\ v_2 \end{bmatrix},\ w=\begin{bmatrix} w_1 \\ w_2 \end{bmatrix},\quad & B(v,w) = \begin{bmatrix} v_1 + w_2 \\ v_2 - w_1 \end{bmatrix}. \quad \text{(Edit: not bilinear.)} \\ v \otimes w = \begin{bmatrix} v_1w_1 & v_1w_2 \\ v_2w_1 & v_2w_2 \end{bmatrix},\quad & B_\otimes(v \otimes w) = {?} \end{align*}
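(A quick numeric sanity check, which is how I discovered the "not bilinear" issue: bilinearity requires $B(2v,w)=2B(v,w)$, but the constant-free structure of $B$ fails this, since $B$ is only affine in each argument. The particular vectors below are just arbitrary test values.)

```python
import numpy as np

def B(v, w):
    # The map from the question: B(v, w) = (v1 + w2, v2 - w1).
    return np.array([v[0] + w[1], v[1] - w[0]])

v = np.array([1.0, 2.0])
w = np.array([3.0, 4.0])

# Bilinearity would require B(2v, w) == 2*B(v, w); it fails here,
# so B cannot factor through the tensor product at all.
print(B(2 * v, w))   # scale v alone
print(2 * B(v, w))   # what bilinearity would demand
```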
How could I extract, say, $v_1$, using only some linear combination of the $v_iw_j$'s? Or is this not what is being asked by this example? How else can I explicitly write $B_\otimes(v\otimes w)$?
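(To illustrate what I think *should* happen for a map that really is bilinear, here is a sketch with a made-up example, $B(v,w) = (v_1w_1 + v_2w_2,\ v_1w_2 - v_2w_1)$, which is not from any reference: each component is a linear combination of the $v_iw_j$'s, so $B_\otimes$ is just a fixed $2\times 4$ matrix acting on the flattened entries of $v\otimes w$.)

```python
import numpy as np

# A genuinely bilinear map (my own toy example):
# B(v, w) = (v1*w1 + v2*w2, v1*w2 - v2*w1).
def B(v, w):
    return np.array([v[0]*w[0] + v[1]*w[1],
                     v[0]*w[1] - v[1]*w[0]])

# The induced linear map acts on (v1w1, v1w2, v2w1, v2w2):
# row k lists the coefficients of B's k-th component in that basis.
M = np.array([[1.0, 0.0,  0.0, 1.0],
              [0.0, 1.0, -1.0, 0.0]])

def B_tensor(T):
    return M @ T.reshape(4)

v = np.array([2.0, -1.0])
w = np.array([0.5, 3.0])
T = np.outer(v, w)       # v ⊗ w as a 2x2 matrix of products v_i*w_j

print(B_tensor(T))       # agrees with B(v, w) on pure tensors
print(B(v, w))
```

This also suggests why extracting $v_1$ alone is impossible: $(v,w)\mapsto v_1$ is linear in $v$ but constant in $w$, hence not bilinear, so no linear functional on $V\otimes W$ can produce it.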
I hope at least that what I already wrote of the example is not grossly unfounded.
Any 'other' simple and instructive examples of tensors / tensor products of vector spaces would, of course, also be much appreciated.