If $(Av,Au)=(v,u)$ then matrix $A$ is orthogonal


Let $A\in M_{n \times n}(\Bbb R)$ and suppose that for every $u, v \in \Bbb R^{n}$ $$(Av,Au) = (v,u)$$ where $(\cdot,\cdot)$ is the standard inner product on $\Bbb R^{n}$. Prove $A$ is an orthogonal matrix.


I wasn't able to solve it; I got to this point:

$\left(Av,Au\right)=\left(Av\right)^{T}Au=v^{T}A^{T}Au$ and $\left(v,u\right)=v^{T}u$

$\left(Av,Au\right)=\left(v,u\right)\ \implies\ v^{T}A^{T}Au\ =\ v^{T}u$

and now I'm stuck... I don't know whether I can conclude that $A^{T}A=I$ just from the last equation, and if not, how to reach a point where I can show it.

Now I have two questions. First, the official solution is this:

Let $u = e_{i}$ and $v = e_{j}$. Then $(a_{i}, a_{j}) = (Ae_{i}, Ae_{j}) = (e_{i}, e_{j}) = \delta_{i,j}$. Thus, the columns of $A$ are orthonormal, so $A$ is orthogonal.

This is super unclear. What are $a_{i}$ and $a_{j}$? What is $\delta_{i,j}$? I understand that $(e_{i},e_{j})=0$ if $i \ne j$ and $1$ otherwise, but why can we choose specific $v,u$ if it's a "for every" claim? If someone can explain to me the logic behind the solution, I'd be grateful.

The second question: is there another way to solve it?

There are 5 answers below.

---

$\{e_i\}$ is an orthonormal basis, and $a_i=Ae_i$ for each $i$. $\delta_{i,j}$ is called the Kronecker delta function and is simply $1$ if $i=j$ and $0$ if $i\neq j$.

---

$\delta_{i,j}$ is the Kronecker delta, defined by $\delta_{i,j}=1$ if $i=j$ and $\delta_{i,j}=0$ otherwise, as you noticed.

You can choose $u,v$ as you like, since the identity

$$v^TA^TAu=v^Tu$$ is valid for any vectors $u,v$, in particular if you choose them among the vectors of the canonical basis $\{e_i\}$. $a_i$ is just notation for $Ae_i$.

The logic behind the solution, often used in linear algebra, is that if something is true for any vector, it is in particular true for the vectors of a basis. This is exactly what is done here for ordered pairs of vectors.
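The basis-vector substitution can also be checked numerically. Here is a minimal Python sketch, using a $2\times 2$ rotation matrix as an example of an orthogonal $A$ (the matrix and helper functions are my own illustration, not from the thread):

```python
import math

# Example orthogonal matrix: a 2x2 rotation (assumption for illustration)
theta = 0.7
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
n = 2

def dot(u, v):
    # Standard inner product (u, v) on R^n
    return sum(x * y for x, y in zip(u, v))

def mat_vec(M, v):
    # Matrix-vector product Mv
    return [dot(row, v) for row in M]

def e(i):
    # i-th standard basis vector
    return [1.0 if k == i else 0.0 for k in range(n)]

for i in range(n):
    for j in range(n):
        # (Ae_i, Ae_j) should equal delta_{ij}: the columns of A are orthonormal
        lhs = dot(mat_vec(A, e(i)), mat_vec(A, e(j)))
        delta = 1.0 if i == j else 0.0
        assert abs(lhs - delta) < 1e-12
```

Since $Ae_i$ is exactly the $i$-th column of $A$, each assertion checks one entry of the identity $(a_i, a_j) = \delta_{i,j}$.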

---

I can complete your solution. Fix $u$. From $v^TA^TAu = v^Tu$, we have $$ v^T(A^TA - I)u = 0, $$ or $(v, (A^TA - I)u) = 0$ for every vector $v$. It is easy to see that for a given vector $a$, if $(a,b) = 0$ for every $b$, then $a=0$ (compute $(a,e_j)$ for $j=1,\dots,n$). It follows that $$ (A^TA-I)u = 0 $$ for every vector $u$ since $u$ is arbitrary. But this means that the rank of $A^TA-I$ is $0$, in other words, $A^TA - I = 0$.
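The small lemma used here, that $(a, e_j)$ recovers the $j$-th component of $a$, is easy to confirm. A quick Python sketch (the example vector is mine, chosen only for illustration):

```python
# (a, e_j) picks out the j-th component of a, so (a, b) = 0 for
# every b forces every component of a to be zero, i.e. a = 0.
a = [3.0, -1.0, 2.0]   # example vector (assumption for illustration)
n = len(a)

for j in range(n):
    e_j = [1.0 if k == j else 0.0 for k in range(n)]
    inner = sum(x * y for x, y in zip(a, e_j))
    assert inner == a[j]
```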

---

$\delta_{i,j}$ is the Kronecker delta, which equals $0$ for $i\neq j$ and $1$ when $i=j$. You can also say that $\delta_{i,j}$ is the $(i,j)$-th entry of the identity matrix $I$. $a_i$ is the vector in $\mathbb{R}^n$ whose entries form the $i$-th column of $A$; that is, the $j$-th component of $a_i$ is $(a_i)_j = a_{ji}$, where $a_{ji}$ is the $(j,i)$-th entry of the matrix $A$. We have $(Ae_i)_j = \sum_k a_{jk}(e_i)_k = a_{ji} = (a_i)_j$, so $Ae_i = a_i$.

Now, from your equations and the definition of the inner product, we have $(a_i,a_j)= \sum_k a_{ki} a_{kj} = \delta_{i,j}$. Again, $$( A^T A)_{ij} = \sum_k (A^T)_{ik}a_{kj} = \sum_k a_{ki} a_{kj} =\delta_{i,j}.$$ Therefore $A^TA=I$, which proves what you needed.
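The entrywise identity $(A^TA)_{ij} = \sum_k a_{ki} a_{kj}$ can be spot-checked in Python; a rotation matrix stands in for an orthogonal $A$ (my example, not from the answer):

```python
import math

# Example orthogonal matrix: a 2x2 rotation (assumption for illustration)
theta = 1.2
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
n = 2

for i in range(n):
    for j in range(n):
        # (A^T A)_{ij} = sum_k a_{ki} a_{kj}: the inner product of columns i and j
        entry = sum(A[k][i] * A[k][j] for k in range(n))
        delta = 1.0 if i == j else 0.0
        assert abs(entry - delta) < 1e-12
```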

---

Yes, in answer to your second question. Here's another way to prove it, which avoids using $\ \delta_{i,j}\ $ and the $\ e_i\ $ (at least directly; they're really still there, lurking in the shadows).

You've deduced that $$ v^TA^TAu=v^Tu $$ for all $\ u,v\in\mathbb{R}^n\ $, which you can write as $$ v^T\big(A^TA-I\big)u=0\ . $$ If you put $\ v=\big(A^TA-I\big)u\ $ in this identity, then you get $$ 0=\big(\big(A^TA-I\big)u\big)^T\big(A^TA-I\big)u=\big\|\big(A^TA-I\big)u\big\|^2\ , $$ which implies that $\ \big(A^TA-I\big)u=0\ $. This must hold for all $\ u\in\mathbb{R}^n\ $, which implies that $\ A^TA-I=0\ $, or, equivalently, $\ A^TA=I\ $, making $\ A\ $ orthogonal.
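The norm argument is easy to test numerically as well. In this short Python sketch (the rotation matrix and test vector are my own choices), $\|(A^TA - I)u\|^2$ comes out zero, as it must when $A$ is orthogonal:

```python
import math

# Example orthogonal matrix: a 2x2 rotation (assumption for illustration)
theta = 0.3
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]
n = 2

# B = A^T A - I, computed entrywise
B = [[sum(A[k][i] * A[k][j] for k in range(n)) - (1.0 if i == j else 0.0)
      for j in range(n)] for i in range(n)]

u = [2.0, -5.0]  # arbitrary test vector (assumption)
w = [sum(B[i][k] * u[k] for k in range(n)) for i in range(n)]  # w = Bu

# ||(A^T A - I)u||^2: the squared norm obtained by choosing v = (A^T A - I)u
norm_sq = sum(x * x for x in w)
assert norm_sq < 1e-12
```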