The Wikipedia article on the cross product says that a vector $a$ which is itself a cross product (that is, $a=c\times d$) can be represented in the expression $a \times b$, for some other vector $b$, by the matrix $$[a]_{\times}=\begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix}$$ where $a\times b = [a]_{\times} b = (dc^T - cd^T)b = (c\times d)\times b$.
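A quick numerical sanity check of these identities (a NumPy sketch; `skew` is just a hypothetical helper name for $[a]_{\times}$):

```python
import numpy as np

def skew(a):
    """Build [a]_x so that skew(a) @ b equals np.cross(a, b)."""
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

rng = np.random.default_rng(0)
c, d, b = rng.standard_normal((3, 3))
a = np.cross(c, d)

lhs = skew(a) @ b                             # [a]_x b
mid = (np.outer(d, c) - np.outer(c, d)) @ b   # (dc^T - cd^T) b
rhs = np.cross(a, b)                          # (c x d) x b

print(np.allclose(lhs, rhs), np.allclose(mid, rhs))
```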
I'm wondering if this matrix representation of the cross product of a vector encodes any of the other properties of the cross product.
I don't see a way of extracting a scalar from it to reproduce the triple scalar product.
Its determinant and trace are always $0$.
$[a]_{\times}^T=-[a]_{\times}$
Its Frobenius norm is $\sqrt{a_2^2+a_3^2+a_1^2+a_3^2+a_1^2+a_2^2} = \sqrt{2}\sqrt{a_1^2+a_2^2+a_3^2}=\sqrt{2}\|a\|$. So up to a factor of $\sqrt{2}$, the Frobenius norm is the same as the norm of $a$.
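These small facts are easy to verify numerically (a NumPy sketch; the skew-symmetric matrix is written out directly):

```python
import numpy as np

a = np.array([1.0, -2.0, 3.0])
A = np.array([[0.0, -a[2], a[1]],
              [a[2], 0.0, -a[0]],
              [-a[1], a[0], 0.0]])

print(np.linalg.det(A))           # ~0 (up to floating-point error)
print(np.trace(A))                # 0
print(np.allclose(A.T, -A))       # True: antisymmetry
print(np.isclose(np.linalg.norm(A, 'fro'),
                 np.sqrt(2) * np.linalg.norm(a)))  # True
```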
Any vector orthogonal to both $c$ and $d$ (finding such a vector is one of the main purposes of the cross product) is a member of the kernel of this matrix: $\operatorname{Ker}[a]_{\times}=\left\{t\begin{bmatrix} a_1 \\ a_2 \\ a_3\end{bmatrix} : t \in \mathbb{R}\right\}$
This means that the plane spanned by $c$ and $d$ is the row space of $[a]_{\times}$ because $\operatorname{Ker}(A)^{\bot} = \operatorname{Row}(A)$ for all matrices $A$. And because this matrix equals its negative transpose, the plane is also given by the column space of $[a]_{\times}$.
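The kernel and row-space claims can be checked numerically: the right singular vector belonging to the zero singular value spans the kernel, and it should be parallel to $a$ and orthogonal to $c$ and $d$ (a NumPy sketch with arbitrarily chosen $c$, $d$):

```python
import numpy as np

c = np.array([1.0, 0.0, 2.0])
d = np.array([0.0, 1.0, -1.0])
a = np.cross(c, d)
A = np.array([[0.0, -a[2], a[1]],
              [a[2], 0.0, -a[0]],
              [-a[1], a[0], 0.0]])

# The right singular vector for the (numerically) zero singular value
# spans Ker(A).
_, s, Vt = np.linalg.svd(A)
null_vec = Vt[-1]

print(np.isclose(s[-1], 0.0))                             # 1-D kernel
print(np.isclose(abs(null_vec @ a), np.linalg.norm(a)))   # null_vec parallel to a
print(np.isclose(null_vec @ c, 0.0), np.isclose(null_vec @ d, 0.0))
```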
The mapping $[a]_{\times} \mapsto a$ is linear, so if someone could come up with an explicit expression for this mapping, then we could express the triple scalar product in terms of it. As in, let $T$ be the name of that mapping; then $T([a]_{\times})\cdot b = (c\times d) \cdot b$, though I admit this is cheating. Even so, it'd be useful to have an explicit expression for $T$.
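For what it's worth, such a mapping can be written down explicitly by reading three entries back off the matrix: $a_1 = ([a]_{\times})_{32}$, $a_2 = ([a]_{\times})_{13}$, $a_3 = ([a]_{\times})_{21}$. A NumPy sketch (`skew` and `vee` are hypothetical helper names):

```python
import numpy as np

def skew(a):
    """[a]_x, the matrix from the question."""
    return np.array([[0.0, -a[2], a[1]],
                     [a[2], 0.0, -a[0]],
                     [-a[1], a[0], 0.0]])

def vee(A):
    """Recover a from [a]_x: a1 = A_32, a2 = A_13, a3 = A_21 (1-indexed)."""
    return np.array([A[2, 1], A[0, 2], A[1, 0]])

rng = np.random.default_rng(1)
a, b = rng.standard_normal((2, 3))

print(np.allclose(vee(skew(a)), a))                # T inverts a -> [a]_x
print(np.isclose(vee(skew(a)) @ b, np.dot(a, b)))  # T([a]_x) . b = a . b
```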
Are there any other useful properties of this matrix?
Given a vector $a$, you can construct a linear map $A$ such that, for any vector $b$, $A(b) = a \times b$. The components of $A$ with respect to some orthogonal basis will give you a matrix representation like the one you wrote. This in no way requires you to write $a$ as the cross product of two other vectors.
Given that $A(b) = a \times b$, you can find the triple scalar product of $a$, $b$, and some vector $u$ as $A(b) \cdot u = (a \times b) \cdot u$. In matrix language, this is the product $u^T A b$.
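A quick check of that identity, together with the familiar determinant form of the triple product (a NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, u = rng.standard_normal((3, 3))
A = np.array([[0.0, -a[2], a[1]],
              [a[2], 0.0, -a[0]],
              [-a[1], a[0], 0.0]])

triple_matrix = u @ A @ b                               # u^T A b
triple_direct = np.dot(np.cross(a, b), u)               # (a x b) . u
triple_det = np.linalg.det(np.column_stack([a, b, u]))  # det[a b u]

print(np.isclose(triple_matrix, triple_direct))
print(np.isclose(triple_matrix, triple_det))
```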
In higher-dimensional spaces, this linear map might be said to correspond to a bivector, or an antisymmetric 2-form. In the most general sense, an antisymmetric 2-form is merely a bilinear map of two vectors that is antisymmetric under interchange: if $\beta$ is such a map and $u, v$ are vectors, then $\beta(u,v) = -\beta(v,u)$.
When you have a metric (as Euclidean space does), you can identify $\beta(u,v) = u \cdot B(v)$. This gives a linear operator $B$, mapping vectors to vectors, that is most analogous to the map $A$ above.
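In three dimensions this identification can be sketched concretely by taking $B = [a]_{\times}$ (an assumed choice, purely for illustration) and checking the antisymmetry of $\beta$:

```python
import numpy as np

a = np.array([0.5, 1.0, -2.0])
B = np.array([[0.0, -a[2], a[1]],
              [a[2], 0.0, -a[0]],
              [-a[1], a[0], 0.0]])

def beta(u, v):
    """Antisymmetric 2-form beta(u, v) = u . B(v)."""
    return u @ B @ v

rng = np.random.default_rng(3)
u, v = rng.standard_normal((2, 3))
print(np.isclose(beta(u, v), -beta(v, u)))  # True: beta(u,v) = -beta(v,u)
```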
Edit:
Such maps allow us to consider generalizations of the cross product beyond three dimensions. In this context, bivectors such as those corresponding to $\beta$ or $B$ correspond to planar subspaces rather than vectors. Geometrically, this map $B$ takes a vector argument and returns a vector in the associated planar subspace that is orthogonal to the argument.
To see the connection between this idea and the cross product, consider the plane orthogonal to $a$.