Let $\times: \mathbb{R}^3 \times \mathbb{R}^3 \to \mathbb{R}^3$ be the cross product. Find all linear operators $T: \mathbb{R}^3 \to \mathbb{R}^3$ such that:
$$\|T(u) \times T(v)\|=\|u \times v\|$$
Can anyone give a hint on how to start? I am stuck.
Coincidentally, someone asked a very similar question yesterday. Since that question is more general and its formulation is somewhat different from yours, I will give a differently phrased answer below.
By the given condition and Lagrange's identity, we have \begin{aligned} \det\left[\pmatrix{u^\top\\ v^\top}T^\top T\pmatrix{u&v}\right] &=\det\left[\pmatrix{(Tu)^\top\\ (Tv)^\top}\pmatrix{Tu&Tv}\right]\\ &=\|T(u) \times T(v)\|^2\\ &=\|u \times v\|^2\\ &=\det\left[\pmatrix{u^\top\\ v^\top}\pmatrix{u&v}\right] \end{aligned} for every $u,v\in\mathbb R^3$. Since $T^\top T$ is positive semidefinite, its eigenvalues $\lambda_1,\lambda_2,\lambda_3$ are nonnegative. So, if $u$ and $v$ are two orthonormal eigenvectors of $T^\top T$, the above equality implies that $\lambda_i\lambda_j=1$ for every $i\ne j$. Hence $\lambda_1=\lambda_2=\lambda_3=1$: from $\lambda_1\lambda_2=\lambda_1\lambda_3=1$ we get $\lambda_2=\lambda_3$, so all three eigenvalues are equal, and being nonnegative with pairwise product $1$, each equals $1$. In turn $T^\top T=I$, i.e. $T$ is real orthogonal.
Conversely, if $T$ is real orthogonal, then \begin{aligned} \|T(u) \times T(v)\|^2 &=\det\left[\pmatrix{(Tu)^\top\\ (Tv)^\top}\pmatrix{Tu&Tv}\right]\\ &=\det\left[\pmatrix{u^\top\\ v^\top}T^\top T\pmatrix{u&v}\right]\\ &=\det\left[\pmatrix{u^\top\\ v^\top}\pmatrix{u&v}\right]\\ &=\|u \times v\|^2. \end{aligned}
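Both directions can be spot-checked numerically. This is a sketch I am adding, not part of the original answer; the helpers `cross`, `norm`, `apply` and the test matrices are mine.

```python
import math
import random

def cross(a, b):
    # Cross product of two 3-vectors.
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def norm(a):
    # Euclidean norm.
    return math.sqrt(sum(x*x for x in a))

def apply(M, x):
    # Matrix-vector product for a 3x3 matrix given as nested lists.
    return tuple(sum(M[i][j]*x[j] for j in range(3)) for i in range(3))

# A concrete orthogonal T: rotation by 0.7 rad about the z-axis.
t = 0.7
R = [[math.cos(t), -math.sin(t), 0.0],
     [math.sin(t),  math.cos(t), 0.0],
     [0.0,          0.0,         1.0]]

random.seed(1)
u = tuple(random.uniform(-1, 1) for _ in range(3))
v = tuple(random.uniform(-1, 1) for _ in range(3))

# Orthogonal T preserves the cross-product norm ...
print(math.isclose(norm(cross(apply(R, u), apply(R, v))), norm(cross(u, v))))  # True

# ... while a non-orthogonal T (scaling by 2) multiplies it by 4.
S = [[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 2.0]]
print(math.isclose(norm(cross(apply(S, u), apply(S, v))), norm(cross(u, v))))  # False
```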
If the norm is the one induced by the standard (Euclidean) dot product, you can write it in Cartesian components:
$$||u\times v||=\sqrt{\epsilon_{ijk}u_jv_k\epsilon_{imn}u_mv_n}$$
Then using the identity $$\epsilon_{ijk}\epsilon_{imn}=\delta_{jm}\delta_{kn}-\delta_{jn}\delta_{km},$$
we have $$||u\times v||=\sqrt{u_iu_iv_jv_j-u_iv_iu_jv_j}$$
If the transformation is linear, $$T(u)_i=R_{ij}u_j,$$ so
$$||T(u)\times T(v)||=\sqrt{R_{ik}R_{im}R_{jp}R_{jq}(u_ku_mv_pv_q-u_kv_mu_pv_q)}$$
For this to equal $||u\times v||$ for all $u,v$, we need $R_{ik}R_{im}=\pm\delta_{km}$, which I think can be visualized as $O(3)\times\mathbb Z_4$, since any orthogonal matrix multiplied by $\pm1,\pm i$ would do the job.
Edit: since the map is from $\mathbb R^3$ to itself, $R_{ik}R_{im}=-\delta_{km}$ is impossible for a real matrix (the left side is positive semidefinite), so $O(3)$ alone is enough.
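The contraction identity used above can be brute-force checked over all index values. This is a small sketch I am adding; `eps` and `delta` are ad hoc helpers, not from the answer.

```python
def eps(i, j, k):
    # Levi-Civita symbol on indices 0, 1, 2: sign of the permutation,
    # 0 if any index repeats. Products are always even, so // is exact.
    return (j - i) * (k - j) * (k - i) // 2

def delta(a, b):
    # Kronecker delta.
    return 1 if a == b else 0

# Check eps_{ijk} eps_{imn} = delta_{jm} delta_{kn} - delta_{jn} delta_{km}
# for every choice of the free indices j, k, m, n.
ok = all(
    sum(eps(i, j, k) * eps(i, m, n) for i in range(3))
    == delta(j, m) * delta(k, n) - delta(j, n) * delta(k, m)
    for j in range(3) for k in range(3) for m in range(3) for n in range(3)
)
print(ok)  # True
```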
Notice that a singular $T$ does not satisfy the equation for all $u$ and $v$: there is a unit vector $u_0$ with $Tu_0 = 0$. Extend $u_0$ to an orthonormal basis $\{u_0, v, w\}$; then $$ \|Tu_0 \times Tv\| = 0 \ne 1 = \|u_0 \times v\|.$$ Thus, $T$ must be invertible.
For an invertible $T$ you have the following identity (cf. Wikipedia, or the proof on page 11): $$ Tu\times Tv = (\det T)(T^{-1})^t(u\times v).$$
Every $w\in\mathbb R^3$ can be written as $u\times v$ (for $w\ne 0$, take suitably scaled $u,v$ orthogonal to $w$), so the given condition says $\|Mw\| = \|w\|$ for all $w$, where $M = (\det T)(T^{-1})^t$; hence $M$ is orthogonal. In particular we have $$ \pm 1 = \det M = (\det T)^3 (\det T)^{-1} = (\det T)^2, $$ and since $(\det T)^2 \ge 0$ this forces $\det T = \pm 1$, so $M = \pm(T^{-1})^t$. As $M$ is orthogonal, so is $(T^{-1})^t$, and therefore $T$ itself needs to be an orthogonal matrix.
Now, assume $T$ is an orthogonal matrix. Then $\det T = \pm 1$ and $(T^{-1})^t = (T^t)^t = T$, so for every $u,v$ we have $$ \|Tu \times Tv\| = \|\pm T (u \times v)\| = \|u\times v\|. $$ That is, orthogonality of $T$ is also sufficient.
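The identity $Tu\times Tv = (\det T)(T^{-1})^t(u\times v)$ can also be verified numerically without computing an inverse, since $(\det T)(T^{-1})^t$ is exactly the cofactor matrix of $T$. A sketch I am adding; the matrix and vectors are arbitrary test data.

```python
import math

def cross(a, b):
    # Cross product of two 3-vectors.
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def apply(M, x):
    # Matrix-vector product for a 3x3 matrix given as nested lists.
    return tuple(sum(M[i][j] * x[j] for j in range(3)) for i in range(3))

def cofactor(M):
    # Cofactor matrix of a 3x3 M; equals det(M) * (M^{-1})^t.
    # The cyclic index trick builds in the (-1)^{i+j} signs.
    return [[M[(i+1) % 3][(j+1) % 3] * M[(i+2) % 3][(j+2) % 3]
             - M[(i+1) % 3][(j+2) % 3] * M[(i+2) % 3][(j+1) % 3]
             for j in range(3)] for i in range(3)]

T = [[2.0, 1.0, 0.0],
     [0.0, 1.0, 1.0],
     [1.0, 0.0, 3.0]]   # an arbitrary invertible matrix
u, v = (1.0, 2.0, 3.0), (4.0, 0.0, 1.0)

lhs = cross(apply(T, u), apply(T, v))          # Tu x Tv
rhs = apply(cofactor(T), cross(u, v))          # det(T) (T^{-1})^t (u x v)
print(all(math.isclose(l, r) for l, r in zip(lhs, rhs)))  # True
```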