I've been playing with the definition of the cross product and am trying to pin down the minimal algebraic assumptions needed to define it uniquely. I remember seeing a post claiming that the cross product is the only vector-valued vector multiplication satisfying distributivity over scalar multiplication and addition ((1)–(2) below) together with a few other simple assumptions; one such requirement would be that the product is orthogonal to both arguments ((4) below). In trying to find these assumptions, I have arrived at the following investigation.
The cross product is defined as an operation $\times : \mathbf{R^3}\times\mathbf{R^3}\rightarrow\mathbf{R^3}$ with the following algebraic properties.
(1) $(c\mathbf{v})\times\mathbf{w} = \mathbf{v}\times (c\mathbf{w}) = c(\mathbf{v}\times\mathbf{w})$
(2a) $(\mathbf{v} + \mathbf{u})\times \mathbf{w} = \mathbf{v}\times \mathbf{w} + \mathbf{u}\times \mathbf{w}$
(2b) $\mathbf{v} \times (\mathbf{u} + \mathbf{w}) = \mathbf{v}\times \mathbf{u} + \mathbf{v}\times \mathbf{w}$
(3) $\mathbf{v} \times \mathbf{w} = -(\mathbf{w} \times \mathbf{v})$.
With these properties along with the assumptions
(i) $\hat{i}\times \hat{j} = \hat{k}$
(ii) $\hat{j}\times \hat{k} = \hat{i}$
(iii) $\hat{k}\times \hat{i} = \hat{j}$
we can derive the definition for such a product by computing $\mathbf{v} \times \mathbf{w} = (v_1 \hat{i} + v_2 \hat{j} + v_3 \hat{k}) \times (w_1 \hat{i} + w_2 \hat{j} + w_3 \hat{k})$.
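This expansion can be checked mechanically. The sketch below (my own, not part of the question) encodes rules (i)–(iii) together with antisymmetry (3) as a multiplication table on the standard basis — note that (3) already forces $\hat{i}\times\hat{i} = \hat{j}\times\hat{j} = \hat{k}\times\hat{k} = \mathbf{0}$ — and then extends bilinearly via (1), (2a), (2b):

```python
import numpy as np

I, J, K = np.eye(3)  # standard basis vectors i-hat, j-hat, k-hat

# Basis multiplication table from (i)-(iii) and antisymmetry (3).
# (3) gives e x e = -(e x e), hence e x e = 0 for each basis vector.
table = {
    (0, 0): np.zeros(3), (1, 1): np.zeros(3), (2, 2): np.zeros(3),
    (0, 1): K, (1, 0): -K,   # (i)   i x j =  k,  so  j x i = -k
    (1, 2): I, (2, 1): -I,   # (ii)  j x k =  i,  so  k x j = -i
    (2, 0): J, (0, 2): -J,   # (iii) k x i =  j,  so  i x k = -j
}

def cross_from_axioms(v, w):
    """Bilinear extension of the basis table using (1), (2a), (2b)."""
    return sum(v[a] * w[b] * table[(a, b)]
               for a in range(3) for b in range(3))

v, w = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0])
print(cross_from_axioms(v, w))  # agrees with np.cross(v, w): [-3.  6. -3.]
```

Since every term of the double sum is dictated by the axioms, this also illustrates why (1)–(3) plus (i)–(iii) pin the product down uniquely.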
My question is: is it possible to replace rule (3) with $\mathbf{v} \times \mathbf{w} = \mathbf{w} \times \mathbf{v}$ and assume only (i) $\hat{i}\times \hat{j} = \hat{k}$ to define a consistent multiplication? It seems like this shouldn't work, but I haven't been able to find a contradiction yet.
An alternate question is: can we assume (1)-(2) with (i) along with
(4) $\mathbf{v} \cdot(\mathbf{v}\times \mathbf{w}) = \mathbf{w} \cdot(\mathbf{v}\times \mathbf{w}) = 0$
and derive (3) by contradiction?
No: if you require $v\times w=w\times v$, then even specifying all of (i)–(iii) is not enough to completely determine the multiplication. So if you specify (i) alone, you certainly have too little.
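To make this concrete (my own illustration, not part of the original answer): under symmetry, (i)–(iii) fix only the mixed basis products, while the diagonal products $\hat{i}\times\hat{i}$, $\hat{j}\times\hat{j}$, $\hat{k}\times\hat{k}$ remain completely free, so many distinct symmetric bilinear maps satisfy all three rules:

```python
import numpy as np

I, J, K = np.eye(3)

def make_symmetric_product(diag):
    """Symmetric bilinear map with i*j = k, j*k = i, k*i = j,
    and arbitrarily chosen diagonal values e_a * e_a = diag[a]."""
    table = {
        (0, 1): K, (1, 0): K,   # (i)   plus symmetry
        (1, 2): I, (2, 1): I,   # (ii)  plus symmetry
        (2, 0): J, (0, 2): J,   # (iii) plus symmetry
        (0, 0): diag[0], (1, 1): diag[1], (2, 2): diag[2],
    }
    def prod(v, w):
        return sum(v[a] * w[b] * table[(a, b)]
                   for a in range(3) for b in range(3))
    return prod

p1 = make_symmetric_product([np.zeros(3)] * 3)  # diagonal products all zero
p2 = make_symmetric_product([I, J, K])          # a different legal choice

v = np.array([1.0, 1.0, 0.0])
print(p1(v, v), p2(v, v))  # two different results; both maps obey (i)-(iii)
```

Both `p1` and `p2` are symmetric, bilinear, and satisfy (i)–(iii), yet they disagree, so the symmetric axioms cannot single out one product.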
Your properties (1), (2a), (2b) together say that $\times$ is a bilinear map $\Bbb{R}^3\times\Bbb{R}^3\to\Bbb{R}^3$, so let us first understand linear and bilinear maps more carefully. Say you have vector spaces $X,Y,Z$ over the same field $\Bbb{F}$, with $X$ and $Y$ finite-dimensional of dimensions $n$ and $m$ respectively (in the example above, $\Bbb{F}=\Bbb{R}$ is the field of real numbers, $n=m=3$, and $X=Y=Z=\Bbb{R}^3$). Now, in order to completely specify a bilinear mapping $T:X\times Y\to Z$, it suffices to fix a basis $\alpha=\{v_1,\dots, v_n\}$ for $X$ and a basis $\beta=\{w_1,\dots, w_m\}$ for $Y$, and to specify $T(v_i,w_j)$ for all $i,j$ (see here for the proof of this claim in the case of linear maps; I leave it to you to prove the analogue for bilinear, and more generally, multilinear maps).
So, in order to fully specify a bilinear map $T:X\times Y\to Z$, you have to specify a total of $nm$ pieces of information (i.e the values $T(v_i,w_j)\in Z$ for all $i,j$).
Now, let us come to the special case where $X=Y$, so we’re interested in bilinear maps $T:X\times X\to Z$. Then of course one has to specify a total of $n^2$ pieces of information. There are two special cases of interest: