Assuming we are in a $3$-dimensional oriented Euclidean vector space $V$ with a scalar product $\iota$, I want to show that:
A linear map $f:V\rightarrow V$ is anti-self-adjoint, i.e. $f^*=-f$, if and only if there exists a vector $a \in V$ with $f(x) = a \times x$ for all $x \in V$.
I was not quite sure how to prove this, but here is how I started for the direction from left to right:
Let $f^* = -f$. Then, if we choose an orthonormal basis $B=(b_1,b_2,b_3)$, the coordinate matrix of $f$ w.r.t. $B$ is skew-symmetric. If we then check where the vectors $b_i$ are mapped, one can easily calculate that the $a$ we are searching for must have the coordinates $$\langle a^*,B\rangle = \left( \begin{array}{c} -d_{23} \\ d_{13} \\ -d_{12} \end{array} \right) $$ w.r.t. $B$, if the skew-symmetric matrix of $f$ w.r.t. $B$ is:
$$\langle B^*,f(B)\rangle = \left( \begin{array}{ccc} d_{11} & d_{12} & d_{13} \\ -d_{12} & d_{22} & d_{23} \\ -d_{13} & -d_{23} & d_{33} \end{array} \right)$$
The problem with this direction of the proof is: I look at the matrix above, calculate what it does to an arbitrary $x$, and see that if the diagonal entries of the matrix are zero, I can choose $a$ as above so that $f(x) = a \times x$. My question is: how do I justify that the diagonal of the matrix above must be zero? It should somehow follow from $f$ being anti-self-adjoint, but I am not sure how.
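For what it's worth, the formula for $a$ can be sanity-checked numerically. The following sketch (using numpy; the concrete entry values are arbitrary, chosen just for illustration) builds the matrix above with zero diagonal, reads off $a = (-d_{23}, d_{13}, -d_{12})$, and checks that multiplying by the matrix agrees with taking the cross product with $a$:

```python
import numpy as np

# Arbitrary off-diagonal entries (illustrative values)
d12, d13, d23 = 1.5, -2.0, 0.7

# Skew-symmetric coordinate matrix of f w.r.t. the orthonormal basis B
A = np.array([[0.0,  d12,  d13],
              [-d12, 0.0,  d23],
              [-d13, -d23, 0.0]])

# Candidate vector a read off from the matrix entries
a = np.array([-d23, d13, -d12])

# Check f(x) = a x x on an arbitrary test vector
x = np.array([0.3, -1.1, 2.4])
print(np.allclose(A @ x, np.cross(a, x)))  # True
```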
Any help would be greatly appreciated!
I think your "anti-self-adjoint" condition means that $f(u)\cdot v= -u\cdot f(v)$ for all $u,v$. If your orthonormal basis is $e_1,e_2,e_3$, I think your $d_{ij}=f(e_i)\cdot e_j$. Then $d_{ii}=f(e_i)\cdot e_i=-e_i\cdot f(e_i)=-f(e_i)\cdot e_i=-d_{ii}$, so $d_{ii}=0$.
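One can also check the converse direction numerically: a map of the form $f(x) = a \times x$ satisfies $f(u)\cdot v = -u\cdot f(v)$, and in particular its diagonal entries $f(e_i)\cdot e_i$ vanish. A quick sketch (numpy; the random vectors are just illustrative test data):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(3)
f = lambda x: np.cross(a, x)  # the map f(x) = a x x

# Anti-self-adjointness: f(u).v = -u.f(v) for arbitrary u, v
u, v = rng.standard_normal(3), rng.standard_normal(3)
print(np.isclose(f(u) @ v, -(u @ f(v))))  # True

# Diagonal entries d_ii = f(e_i).e_i all vanish
E = np.eye(3)
print([float(f(E[i]) @ E[i]) for i in range(3)])  # [0.0, 0.0, 0.0]
```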