Let $f\colon \mathbb{R}^3 \to \mathbb{R}^3$ be a linear function satisfying $$\langle x, f(x) \rangle = 0$$ for all $x \in \mathbb{R}^3$. I want to prove (and hopefully not disprove) that there exists a vector $F \in \mathbb{R}^3$ s.t. $$f(v) = F \times v$$
The problem is equivalent to $F = v \times f(v)$ being constant. I've tried showing its derivative is zero, but I haven't been lucky with this approach.
I've always been mathematically skeptical as to why cross products appear anywhere at all in nature, and an affirmative answer to this question would quench my thirst for a mathematical explanation. It would be even more satisfying if it's a "coordinate free" proof, without going down to messy calculations with the partial derivatives of $f$ and so on.
A proof by banging on coordinates
I use the standard basis $e_1, e_2, e_3$ and the convention $e_1 \times e_2 = e_3$. Let $A$ be the matrix of $f$ in this basis, so $A_{ij} = \langle e_i, f(e_j) \rangle$. The condition $\langle x, f(x) \rangle = 0$ immediately implies that $A_{ii} = \langle e_i, f(e_i) \rangle = 0$, and furthermore that $A$ is antisymmetric ($A^T = -A$): expanding by bilinearity and dropping the diagonal terms, which vanish, $$0 = \langle e_i + e_j, f(e_i + e_j) \rangle = \langle e_i, f(e_j) \rangle + \langle e_j, f(e_i) \rangle.$$ Hence $A$ looks like this: $$ A = \begin{pmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{pmatrix}, $$ which is exactly the matrix of the linear operator $g(v) = (a, b, c) \times v$. There is really nothing special going on here beyond the fact that the condition forces the matrix of $f$ to be antisymmetric in the basis $e_1, e_2, e_3$.
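If you want to see the correspondence concretely, here is a small numerical sanity check (a sketch using numpy, not part of the proof): build the antisymmetric matrix from $F = (a, b, c)$ and check that it acts as $v \mapsto F \times v$ and satisfies $\langle v, f(v) \rangle = 0$.

```python
import numpy as np

def skew(F):
    """Matrix of the linear map v -> F x v in the standard basis."""
    a, b, c = F
    return np.array([[0., -c,  b],
                     [ c, 0., -a],
                     [-b,  a, 0.]])

F = np.array([1.0, 2.0, 3.0])
A = skew(F)
v = np.array([0.5, -1.0, 2.0])

print(np.allclose(A @ v, np.cross(F, v)))   # True: A v equals F x v
print(np.isclose(v @ (A @ v), 0.0))         # True: <v, f(v)> = 0
```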
A (retrospective) explanation
Using the $f$ you started with, you can construct a bilinear form $B_f(v, w) = \langle v, f(w) \rangle$ on $\mathbb{R}^3$, and the condition $\langle x, f(x) \rangle = 0$ is equivalent to requiring that this form be antisymmetric ($B_f(v, w) = -B_f(w, v)$). Write $\operatorname{Alt}(\mathbb{R}^3, \mathbb{R})$ for the vector space of antisymmetric bilinear forms on $\mathbb{R}^3$.
There is a linear map $\Phi\colon \mathbb{R}^3 \to \operatorname{Alt}(\mathbb{R}^3, \mathbb{R})$ taking a vector $F \in \mathbb{R}^3$ to the bilinear form $B_{F \times (-)}(v, w) = \langle v, F \times w \rangle$. If $F \neq 0$, this form is nonzero: pick any nonzero $w$ orthogonal to $F$; then $F \times w \neq 0$ and $B_{F \times (-)}(F \times w, w) = \lVert F \times w \rVert^2 > 0$. So $\Phi$ is injective, and since both spaces are $3$-dimensional, $\Phi$ is an isomorphism: every antisymmetric bilinear form on $\mathbb{R}^3$ is $B_{F \times (-)}$ for a unique $F \in \mathbb{R}^3$.
So we have $B_f = B_{F \times (-)}$ for some $F \in \mathbb{R}^3$, i.e. $\langle v, f(w) \rangle = \langle v, F \times w \rangle$ for all $v, w$. Since the inner product is nondegenerate, this forces $f(w) = F \times w$ for all $w$, that is, $f = F \times (-)$.
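The explanation above can also be run in the other direction numerically (again a sketch with numpy, not part of the proof): starting from an arbitrary antisymmetric matrix $A$, read off the unique $F$ from its entries, matching the matrix in the coordinate proof, and check that $f = F \times (-)$.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = (M - M.T) / 2                 # antisymmetric part, so <x, A x> = 0 for all x

# Recover F = (a, b, c) from the entries of A, matching the matrix
# [[0, -c, b], [c, 0, -a], [-b, a, 0]] above.
F = np.array([A[2, 1], A[0, 2], A[1, 0]])

v = rng.standard_normal(3)
print(np.allclose(A @ v, np.cross(F, v)))   # True: f(v) = F x v
```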