The convex hull of rotations does not contain reflections


$\newcommand{\SO}{\operatorname{SO}_n}$ $\newcommand{\Om}{\operatorname{O}_n^{-}}$ I saw here the following claim:

Let $\SO$ be the special orthogonal group, and let $\Om$ be the orthogonal matrices of determinant $-1$. Then the convex hull of $\SO$ does not contain any element from $\Om$.

I tried to prove this "directly", and reduced this to the following claim:

Let $A,B \in \SO$. Suppose that $$ AB^T+BA^T=2\operatorname{Id}. \tag{1} $$ Then $A=B$. Writing $R=AB^T$, this reduces to proving that if $$ R+R^T=2\operatorname{Id}, \tag{2} $$ then $R=\operatorname{Id}$.

Is there an easy elementary proof of this? I was able to show this directly for $n=2$ using the explicit formula for 2D rotation matrices. Using canonical forms, I can reduce the higher-dimensional case to the two-dimensional case.
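For concreteness, the $n=2$ case mentioned above can be checked numerically. A minimal sketch with NumPy (the parametrization of $\operatorname{SO}_2$ by an angle is the standard one): for $R = R(\theta)$ one has $R + R^\top = 2\cos\theta \cdot \operatorname{Id}$, so $(2)$ forces $\cos\theta = 1$, i.e. $R = \operatorname{Id}$.

```python
import numpy as np

def rot(theta):
    # 2D rotation matrix R(theta) in SO(2)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# For R = R(theta): R + R^T = 2*cos(theta)*Id,
# so R + R^T = 2*Id forces cos(theta) = 1, i.e. R = Id.
for theta in [0.0, 0.5, np.pi / 3, np.pi]:
    R = rot(theta)
    S = R + R.T
    assert np.allclose(S, 2 * np.cos(theta) * np.eye(2))
    if np.allclose(S, 2 * np.eye(2)):
        # only theta = 0 (mod 2*pi) reaches this branch
        assert np.allclose(R, np.eye(2))
```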

Is there a proof which avoids using canonical forms?

I can show that if $R$ satisfies equation $(2)$, then so does $R^k$ for every natural number $k$, but I am not sure this helps.


Best answer

I can show $(2)$ implies $R = \operatorname{Id}$, using strict convexity of the Euclidean norm. By $(2)$, for any $x \in \Bbb{R}^n$, $$\|Rx + R^\top x + x\| = \|3x\| = \|x\| + \|x\| + \|x\| = \|Rx\| + \|R^\top x\| + \|x\|,$$ so the triangle inequality holds with equality; by strict convexity, $Rx$, $R^\top x$, and $x$ are all non-negative scalar multiples of one another. Since all three have the same norm, they must be equal. Thus $Rx = x$. This holds for all $x$, so $R = \operatorname{Id}$.
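The equality case driving this argument can be probed numerically: for an orthogonal $R$ other than the identity, some $x$ makes the triangle inequality strict, i.e. $\|Rx + R^\top x + x\| < 3\|x\|$. A sketch assuming NumPy (the QR-based sampler of $\operatorname{SO}_n$ is one common construction, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_special_orthogonal(n):
    # QR factorization of a Gaussian matrix gives an orthogonal Q;
    # normalize column signs, then flip one column if needed to get det = +1.
    q, r = np.linalg.qr(rng.normal(size=(n, n)))
    q *= np.sign(np.diag(r))
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]
    return q

n = 4
R = random_special_orthogonal(n)
if not np.allclose(R, np.eye(n)):
    # Search for a witness x with ||Rx + R^T x + x|| strictly below 3 ||x||.
    xs = rng.normal(size=(100, n))          # rows are candidate vectors x
    lhs = np.linalg.norm(xs @ R.T + xs @ R + xs, axis=1)
    assert (lhs < 3 * np.linalg.norm(xs, axis=1) - 1e-9).any()
```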

Another answer

It is enough to show that every isometry is an extreme point of the unit ball $B_1(0, L(E))$, where $E$ is your Euclidean space. For suppose $T$ is a convex combination $$T = \lambda_1 S_1 + \lambda_2 S_2,$$ with $\lambda_1, \lambda_2 > 0$, $\lambda_1 + \lambda_2 = 1$, $\|T(v)\| = \|v\|$ for all $v \in E$, and $\|S_i\| \le 1$, $i = 1, 2$. Then we must have $\|S_i(v)\| = \|v\|$ (otherwise the inequality $\|T(v)\| \le \lambda_1 \|S_1(v)\| + \lambda_2 \|S_2(v)\|$ would be strict). Since the unit ball of $E$ is strictly convex and $\|T(v)\| = \|v\| = \|S_i(v)\|$ with $T(v) = \lambda_1 S_1(v) + \lambda_2 S_2(v)$, we get $S_1(v) = S_2(v)$ for all $v$, and so $S_1 = S_2 = T$.
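In the $n=2$ case this extreme-point claim has a very concrete numerical illustration: a convex combination of rotations $\begin{pmatrix} c & -s \\ s & c\end{pmatrix}$ again has the form $\begin{pmatrix} p & -q \\ q & p\end{pmatrix}$, whose determinant $p^2+q^2$ is non-negative, while a reflection has determinant $-1$. A sketch assuming NumPy (the sampling scheme is just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def rot(theta):
    # 2D rotation matrix in SO(2)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Convex combinations of 2D rotations keep the form [[p, -q], [q, p]],
# so det = p^2 + q^2 >= 0; a reflection has det = -1 and is unreachable.
for _ in range(1000):
    weights = rng.dirichlet(np.ones(5))          # convex weights, sum to 1
    angles = rng.uniform(0, 2 * np.pi, size=5)
    T = sum(w * rot(a) for w, a in zip(weights, angles))
    assert np.linalg.det(T) >= -1e-12
```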