Is a smooth map whose differential lies in a coset of the orthogonal group affine?


Let $f:\mathbb{R}^d \to \mathbb{R}^d$ be a smooth map, and suppose there exists a matrix $C \in M_d$ such that $df(x) \in O(d)\cdot C$ for every $x \in \mathbb{R}^d$ (here $O(d)$ denotes the orthogonal group).

Is it true that $f$ must be affine?

When $C$ is invertible the answer is positive:

Define $h=f \circ C^{-1}$. Then $dh_x=df_{C^{-1}x} \circ C^{-1} \in O(d)\,C C^{-1}=O(d)$, so $h$ is a (Riemannian) isometry, hence affine; consequently $f=h \circ C$ is affine as well.
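As a quick numerical sanity check of this step (illustrative only, not part of the proof): if $df_x=Q_xC$ with $C$ invertible, then $df_x\,C^{-1}=Q_x$ is orthogonal. The matrices below are random choices made up for the test.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))  # a random orthogonal Q_x
C = rng.standard_normal((d, d))                   # generically invertible
df = Q @ C                                        # a differential in O(d)·C
dh = df @ np.linalg.inv(C)                        # differential of h = f ∘ C^{-1}
assert np.allclose(dh.T @ dh, np.eye(d))          # dh lies in O(d)
```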

Accepted answer:

I do not think so. Take $\hat{f} : \mathbb{R}^2 \to \mathbb{R}^3$ parametrizing the cylinder embedded in 3D: \begin{align} \hat{x}_1 &= x_1\\ \hat{x}_2 &= \cos(x_2)\\ \hat{x}_3 &= \sin(x_2)\\ \end{align} Let $P : \mathbb{R}^3 \to \mathbb{R}^2$ be the projector $$ P : (x_1, x_2, x_3) \mapsto (x_1, x_2)$$ and form the map $$f : \mathbb{R}^3 \to \mathbb{R}^3$$

$$f(x) = \hat{f}\big(P \, x\big)$$ Then $$df(x) = \begin{pmatrix} 1 & 0 & 0\\ 0 & -\sin(x_2) & 0\\ 0 & \cos(x_2) & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0\\ 0 & -\sin(x_2) & \cos(x_2)\\ 0 & \cos(x_2) & \sin(x_2) \end{pmatrix} \begin{pmatrix} 1 & 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 0 \end{pmatrix}$$
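The key identity behind this factorization can be confirmed symbolically: $df^Tdf = C^TC$ at every point with $C = \operatorname{diag}(1,1,0)$, which is equivalent to $df(x) \in O(3)\cdot C$ (two matrices with equal Gram matrices differ by an orthogonal factor). A short sympy check, for what it's worth:

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
f = sp.Matrix([x1, sp.cos(x2), sp.sin(x2)])   # f(x) = f-hat(P x)
df = f.jacobian([x1, x2, x3])
C = sp.diag(1, 1, 0)
# df^T df == C^T C everywhere, i.e. df(x) ∈ O(3)·C
assert sp.simplify(df.T * df - C.T * C) == sp.zeros(3)
```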

However, such a map seems to arise from an isometric immersion. Take the kernel of $C$, call it $V = \ker(C)$, and its orthogonal complement $W = V^{\perp}$. Set $\dim(V) = d_0$ and $\dim(W) = d_1 = d-d_0$. Let $C_0$ be the matrix whose columns form an orthonormal basis of $V$. Then $C\, C_0 = 0$ and $\operatorname{rank}(C_0) = d_0$. Now $\mathbb{R}^{d_0}$ acts as a group on $\mathbb{R}^d$ by $x \mapsto x + C_0 v$ for $v \in \mathbb{R}^{d_0}$, and one can check that $f\big(x + C_0v\big) = f(x)$. Therefore there exists a smooth map $\hat{f} : W \to \mathbb{R}^d$ such that $f = \hat{f}\circ P_W$, where $P_W : \mathbb{R}^d \to W$ is the orthogonal projector. Consequently, the map $\hat{f}\circ \big(C|_W\big)^{-1} : W \to \mathbb{R}^d$ defined on $W$ is an isometric immersion. At least it seems so to me; I didn't check everything in detail, but maybe this is helpful even if I have made a mistake somewhere.
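On the cylinder example this structure is easy to verify: $C = \operatorname{diag}(1,1,0)$, so $V = \ker C = \operatorname{span}(e_3)$, $W = \operatorname{span}(e_1,e_2)$, $C|_W = \operatorname{Id}$, and $\hat f \circ (C|_W)^{-1} = \hat f$ is the cylinder parametrization, whose Jacobian has orthonormal columns. A sketch in sympy:

```python
import sympy as sp

w1, w2 = sp.symbols('w1 w2')
g = sp.Matrix([w1, sp.cos(w2), sp.sin(w2)])   # f-hat on W, identified with R^2
dg = g.jacobian([w1, w2])
assert sp.simplify(dg.T * dg) == sp.eye(2)    # dg^T dg = Id: isometric immersion
```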

Second answer:

I am elaborating on Futurologist's idea on the "source" of these maps:

Proposition:

Let $f:\mathbb{R}^d \to \mathbb{R}^d$ be a smooth map. Then, the following two statements are equivalent:

  1. There exists a matrix $C \in M_d$ such that $df(x) \in O(d)\cdot C$ for every $x \in \mathbb{R}^d$.

  2. There exists a subspace $W \subseteq \mathbb{R}^d$ such that $f$ can be factored as follows:

$f=g \circ A$, where $A:\mathbb{R}^d \to W$ is linear, and $g:W \to \mathbb{R}^d$ is a smooth isometric immersion (not necessarily linear).

In fact it can be shown that $A=T\circ P_W$ where $P_W:\mathbb{R}^d \to W$ is the orthogonal projection on $W$, and $T:W \to W$ is an automorphism of $W$. (In particular, the restriction $A|_W:W \to W$ is invertible).

Proof:

We start with $(1) \Rightarrow (2)$:

Suppose $df \in O(d)C$ and that $C$ is symmetric (we will show at the end that this is not a real restriction, i.e. we may assume it without loss of generality). Then we can write $df_x=Q_xC$ with $Q_x \in O(d)$.

Define $V = \ker(C)$ and $W=V^{\perp}$ (the orthogonal complement of $V$). Set $\dim(V) = d_0$ and $\dim(W) = d_1 = d-d_0$.

Let $C_0$ be the $d \times d_0$ matrix whose columns form an orthonormal basis of $V$. Then $C\, C_0 = 0$ and $\operatorname{rank}(C_0) = d_0$.

Lemma 1: $\operatorname{Image}C_0=V$.

Proof of lemma 1:

Clearly $\operatorname{Image}C_0 \subseteq \ker(C)=V$. Now use equality of dimensions.

$\mathbb{R}^{d_0}$ acts as a group on $\mathbb{R}^d$ by $x \mapsto x + C_0 v$ for $v \in \mathbb{R}^{d_0}$.

Lemma 2: $f\big(x + C_0v\big) = f(x) \, \,$ for every $x \in\mathbb{R}^d \, , \, v \in \mathbb{R}^{d_0}$.

Proof of lemma 2:

Fix $x \in \mathbb{R}^d$. Consider the map $h:\mathbb{R}^{d_0} \to \mathbb{R}^{d}$, defined by $h(v)=f\big(x + C_0v\big)$.

Then $dh_v=df_{x + C_0v} \cdot C_0=Q_{x + C_0v}\cdot CC_0=0$. This implies $h$ is constant. In particular, $h(v)=h(0)$ for all $v$.
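On the cylinder counterexample this invariance is plainly visible: with $C=\operatorname{diag}(1,1,0)$ we can take $C_0=e_3$, and $f(x+C_0v)=f(x)$ because $f$ does not depend on $x_3$. A quick illustrative check:

```python
import numpy as np

def f(x):
    # the counterexample: f(x) = (x1, cos x2, sin x2)
    return np.array([x[0], np.cos(x[1]), np.sin(x[1])])

C0 = np.array([0.0, 0.0, 1.0])   # orthonormal basis of V = ker C
x = np.array([0.3, -1.2, 2.5])
for v in (-2.0, 0.7, 10.0):
    assert np.allclose(f(x + C0 * v), f(x))   # Lemma 2: f(x + C0 v) = f(x)
```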

Define $\, \hat{f} : W \to \mathbb{R}^d$ by $\hat f(w)=f(w)$. Then, for every $x \in \mathbb{R}^d$, $f(x) = \hat{f}\circ P_W(x)$, where $P_W : \mathbb{R}^d \to W$ is the orthogonal projection on $W$.

Indeed, write $x=v+v^\perp$ with $v \in V$, $v^\perp \in W$. Then $$ f(x)=f(v+v^\perp)=f(v^\perp)=\hat f(v^\perp)=\hat{f}\circ P_W(x)$$ (the second equality follows from lemmas 1 and 2).

Since $C$ is symmetric, $\operatorname{Image}(C)=(\ker C)^\perp=V^\perp=W$. Since $V=\ker C$, we get $$ W=\operatorname{Image}(C)=\operatorname{Image}(C|_W).$$

Consider the map $C|_W:W \to W$; it is injective (since $\ker C \cap W=0$) and surjective by the above, hence invertible. Finally, define $g=\hat{f}\circ \big(C|_W\big)^{-1} : W \to \mathbb{R}^d$.

We claim $g$ is an isometric immersion.

Indeed,

$$ QC=df=d\hat f \circ P_W \Rightarrow C^TC=df^Tdf=(P_W)^T(d\hat f)^T d\hat f \circ P_W \tag{1}.$$

Restricting this equality to $W$ (and recalling that $C|_W:W \to W$, $(P_W)|_W=Id_W$, and $(P_W)^T=i_{W \to \mathbb{R}^d}$ is the inclusion map), we get

$$ (C|_W)^T C|_W=(C^T)|_W C|_W = (d\hat f)^T d\hat f \tag{2}.$$

Equality $(2)$ implies

$$ dg^Tdg=\big( (C|_W)^{-1} \big)^T (d \hat{f})^T d \hat{f}(C|_W)^{-1}=Id_W, $$

as required.
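To see equality $(2)$ and this conclusion in action with a nontrivial $C|_W$ (a made-up variant of the cylinder example, just for illustration): take $C=\operatorname{diag}(2,3,0)$ and $f(x)=(2x_1,\cos 3x_2,\sin 3x_2)$, so that $df^Tdf=\operatorname{diag}(4,9,0)=C^TC$; then $g=\hat f\circ (C|_W)^{-1}$ recovers the unit-speed cylinder parametrization:

```python
import sympy as sp

w1, w2 = sp.symbols('w1 w2')
fhat = lambda w: sp.Matrix([2*w[0], sp.cos(3*w[1]), sp.sin(3*w[1])])
CW_inv = sp.diag(sp.Rational(1, 2), sp.Rational(1, 3))  # (C|_W)^{-1}
g = fhat(CW_inv * sp.Matrix([w1, w2]))                  # g = f-hat ∘ (C|_W)^{-1}
dg = g.jacobian([w1, w2])
assert sp.simplify(dg.T * dg) == sp.eye(2)              # dg^T dg = Id_W
```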

Finally, note that $$f = \hat{f}\circ P_W=\hat{f} \circ (C|_W)^{-1} \circ (C|_W P_W)=g \circ (C|_W P_W).$$

Denoting $A=(C|_W P_W):\mathbb{R}^d \to W,$ we see that $f$ is factored as stated in the claim.
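An end-to-end numerical check of this factorization on the cylinder example (identifying $W$ with $\mathbb{R}^2$; here $C|_W=\operatorname{Id}$, so $A$ is just the projection $(x_1,x_2,x_3)\mapsto(x_1,x_2)$):

```python
import numpy as np

def g(w):                 # isometric immersion W -> R^3
    return np.array([w[0], np.cos(w[1]), np.sin(w[1])])

def A(x):                 # A = C|_W ∘ P_W; here just the orthogonal projection
    return x[:2]

def f(x):                 # the original map
    return np.array([x[0], np.cos(x[1]), np.sin(x[1])])

x = np.array([1.0, 2.0, -3.0])
assert np.allclose(f(x), g(A(x)))   # f = g ∘ A
```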


To complete the proof of $(1) \Rightarrow (2)$, we justify the symmetry assumption: we show that $df \in O(d)C$ for some constant $C \in M_d$ implies $df \in O(d)\tilde C$ for some symmetric $\tilde C$. (In fact, $\tilde C$ can be chosen to be symmetric positive semi-definite.)

Indeed, let $df_x=Q_xC=\tilde Q_x \tilde C_x$, where $\tilde Q_x \in O(d)$ and $\tilde C_x$ is symmetric positive semi-definite (i.e. $\tilde Q_x \tilde C_x$ is a polar decomposition of $df_x$). Then

$$C^TC=df_x^Tdf_x=\tilde C_x^2,$$ so the uniqueness of the positive semi-definite square root implies $ \tilde C_x= \sqrt{C^TC}$, which is constant.
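This step, too, can be sanity-checked numerically (an illustrative sketch with random matrices): for $df=QC$ with varying orthogonal $Q$ and fixed $C$, the positive semi-definite polar factor of $df$ is always $\sqrt{C^TC}$:

```python
import numpy as np
from scipy.linalg import polar, sqrtm

rng = np.random.default_rng(1)
d = 3
C = rng.standard_normal((d, d))
C_tilde = sqrtm(C.T @ C)                 # the claimed constant factor √(CᵀC)
for _ in range(5):
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))  # random Q_x ∈ O(d)
    _, P = polar(Q @ C)                  # right polar decomposition: QC = U P
    assert np.allclose(P, C_tilde)       # P does not depend on Q_x
```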

Now, we show $(2) \Rightarrow (1)$:

Suppose $f=g \circ A$. Then $df=dg \cdot A$ and $df^T=A^T(dg)^T$, so

$$ df^Tdf=A^T(dg)^Tdg\,A=A^TA,$$ since $g$ is an isometric immersion, i.e. $(dg)^Tdg=Id_W$.

Let $df_x=Q_xC_x$ be a polar decomposition of $df_x$. Then:

$$C_x^2 =A^TA \Rightarrow C_x=\sqrt{A^TA},$$ so $C_x=C$ is constant, and $df_x=Q_xC \in O(d)\cdot C$ as required.
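A final symbolic check of this direction on the cylinder example (illustrative; $A$ is written as a $2\times 3$ matrix by identifying $W$ with $\mathbb{R}^2$): for $f=g\circ A$ the Gram matrix $df^Tdf$ equals $A^TA$ at every point, so the polar factor $\sqrt{A^TA}$ is indeed constant.

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
A = sp.Matrix([[1, 0, 0], [0, 1, 0]])      # A : R^3 -> W, identified with R^2
g = lambda w: sp.Matrix([w[0], sp.cos(w[1]), sp.sin(w[1])])
f = g(A * sp.Matrix([x1, x2, x3]))         # f = g ∘ A
df = f.jacobian([x1, x2, x3])
# df^T df = A^T A pointwise, hence C_x = sqrt(A^T A) is constant
assert sp.simplify(df.T * df - A.T * A) == sp.zeros(3)
```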