Show that if $\|Ax\|_2=\|x\|_2$ for every $x \in \mathbb{R}^n$, then $A$ is an orthogonal matrix

  1. Show that if $\langle Ax,x\rangle=x^TAx=0$ for every $x \in \mathbb{R}^n$, then $A=0$.
  2. Using $(1)$, show that if $\|Ax\|_2=\|x\|_2$ for every $x \in \mathbb{R}^n$, then $A$ is an orthogonal matrix.

What I have tried so far:

For $(1)$ I have shown that $$x^TAx=x^TA_+x=0, \qquad A_+=\frac{1}{2}(A+A^T).$$ Since this holds for every $x \in \mathbb{R}^n$, we can write $$0=(x+y)^TA(x+y)=x^TAx+x^TAy+y^TAx+y^TAy=x^TAy+y^TAx=x^T(A+A^T)y.$$ Then I use the spectral theorem, which states that for every real symmetric matrix $A$ there exists an orthogonal matrix $P$ such that $$D=PAP^T \implies A=P^TDP.$$ Using this fact, and setting $y=Px$, I can write $$x^TA_+x=x^TP^TDPx=(Px)^TD(Px)=y^TDy=0.$$ This is as far as I have gotten. My question is: is this enough to prove $A=0$, or are more steps needed to make it clear?
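The two identities used above (the quadratic form only sees the symmetric part, and expanding $(x+y)^TA(x+y)$ isolates the cross terms) can be sanity-checked numerically. A minimal sketch with NumPy; the random matrix and vectors are arbitrary illustrations, not part of any proof:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# symmetric part of A
A_plus = 0.5 * (A + A.T)

# the quadratic form sees only the symmetric part: x^T A x = x^T A_+ x
assert np.isclose(x @ A @ x, x @ A_plus @ x)

# expanding (x+y)^T A (x+y) leaves exactly the cross terms x^T (A + A^T) y
assert np.isclose(x @ A @ y + y @ A @ x, x @ (A + A.T) @ y)
```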


For $(2)$

I honestly don't know exactly how to use $(1)$ to prove this one; however, here is what I have tried: $$\langle Ax,Ax\rangle=\|x\|_2^2 \implies \langle x,A^TAx\rangle=\|x\|_2^2$$

We know that for the $2$-norm, if $P$ is orthogonal then $\|x\|_2=\|Px\|_2$, so $$\langle Ax,Ax\rangle=\|Px\|_2^2 \implies \langle x,A^TAx\rangle=\langle x,P^TPx\rangle$$ Is this enough to prove $P=A$? I guess not, and that is exactly my question.

On BEST ANSWER

Everything done so far seems too complicated.

Assume $A$ is symmetric. Start with $e_i^\intercal A e_i = 0,$ which gives $a_{ii} = 0.$ Next, $$0 = (e_i + e_j)^\intercal A (e_i + e_j) = a_{ii} + a_{ij} + a_{ji} + a_{jj} = a_{ij} + a_{ji},$$ so $A$ is also antisymmetric. Hence $A^\intercal = A = -A$ and $A = 0.$
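The basis-vector probing in the argument above can be illustrated concretely. A small sketch, using a hypothetical symmetric matrix chosen only to show what each probe recovers:

```python
import numpy as np

# a symmetric matrix with zero diagonal (hypothetical example)
A = np.array([[0.0, 2.0],
              [2.0, 0.0]])
e0, e1 = np.eye(2)

# e_i^T A e_i recovers the diagonal entry a_ii
assert e0 @ A @ e0 == A[0, 0]

# (e_i + e_j)^T A (e_i + e_j) = a_ii + a_ij + a_ji + a_jj
lhs = (e0 + e1) @ A @ (e0 + e1)
assert lhs == A[0, 0] + A[0, 1] + A[1, 0] + A[1, 1]
```

So if the quadratic form vanishes everywhere, each probe forces $a_{ii}=0$ and then $a_{ij}+a_{ji}=0$.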

What if $A$ is not symmetric? I think the result of part 1 is then false (part 2 is still true). We still reach $a_{ij} + a_{ji} = 0,$ so $A$ must be antisymmetric. But consider $A = \begin{bmatrix} 0 &-1 \\ 1 &0 \end{bmatrix}.$ Then $(x, y) A (x, y)^\intercal = (x, y) (-y, x)^\intercal = 0,$ yet $A$ is not the zero matrix.
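This counterexample is easy to verify numerically: the quadratic form of the rotation matrix vanishes on every vector, although the matrix itself is nonzero. A quick NumPy check:

```python
import numpy as np

# the proposed counterexample: antisymmetric, nonzero
A = np.array([[0.0, -1.0],
              [1.0, 0.0]])

rng = np.random.default_rng(1)
for _ in range(100):
    v = rng.standard_normal(2)
    # the quadratic form v^T A v vanishes for every v
    assert np.isclose(v @ A @ v, 0.0)

# yet A is not the zero matrix
assert not np.allclose(A, 0.0)
```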

Lemma. Any linear map on $\mathbf{R}^d$ that preserves the norm is an orthogonal transformation. Proof. Let $A$ be such a linear map, that is, $\| A x\| = \|x\|$ for all $x.$ We may use the polarization identity $$ 4(x, y) = \|x + y\|^2 - \|x - y\|^2. $$ Applying $A$ inside the norms on the right-hand side and using linearity, we reach $(Ax, Ay) = (x, y)$ for all vectors $x, y.$ Then $(x, y) = (Ax, Ay) = (x, A^\intercal A y)$ (this holds whether or not $A$ is symmetric). So $(x, (A^\intercal A - I)y) = 0$ for all $x, y,$ and now we may apply the first part to the symmetric matrix $A^\intercal A - I$ to conclude $A^\intercal A = I.$ QED
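The two ingredients of the lemma (the polarization identity, and norm preservation forcing $A^\intercal A = I$) can be checked numerically. A sketch assuming an orthogonal matrix built via QR factorization, so norm preservation holds by construction:

```python
import numpy as np

rng = np.random.default_rng(2)
# QR factorization yields an orthogonal Q, so ||Qx|| = ||x|| by construction
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
x = rng.standard_normal(3)
y = rng.standard_normal(3)

# polarization identity: 4 (x, y) = ||x + y||^2 - ||x - y||^2
assert np.isclose(4 * (x @ y),
                  np.linalg.norm(x + y) ** 2 - np.linalg.norm(x - y) ** 2)

# norm preservation forces (Qx, Qy) = (x, y), i.e. Q^T Q = I
assert np.isclose((Q @ x) @ (Q @ y), x @ y)
assert np.allclose(Q.T @ Q, np.eye(3))
```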