Do orthogonal eigenvectors imply a symmetric matrix?


If an $n \times n$ matrix $\mathbf A$ is diagonalizable, and orthogonal eigenvectors of $\mathbf A$ form a basis of $\mathbb{R}^n$, is $\mathbf A$ symmetric?

Here is what I tried:

Suppose $\{\vec{a_1},\vec{a_2},\dots,\vec{a_n}\}$ is a set of eigenvectors of $A$ which forms a basis of $\mathbb{R}^n$. Suppose $P$ is the matrix with columns $\vec{a_1},\vec{a_2},\dots,\vec{a_n}$, and $D$ is the diagonal matrix with the eigenvalue corresponding to $\vec{a_i}$ as its $i$-th diagonal entry.

Then $A = PDP^{-1}$ and $A^{T} = {P^{-1}}^{T}D^{T}P^{T}$. $D$ is diagonal, so $D^{T} = D$. But I got stuck here.
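A quick numerical sketch of the fact needed to get unstuck (this assumes NumPy and is not part of the original question): if the eigenvectors are normalized so the columns of $P$ are orthonormal, then $P^{-1} = P^T$, which collapses the expression for $A^T$.

```python
import numpy as np

# Sketch: a matrix P with orthonormal columns satisfies P^{-1} = P^T.
rng = np.random.default_rng(0)

# A random orthonormal basis of R^4, obtained via the QR decomposition.
P, _ = np.linalg.qr(rng.standard_normal((4, 4)))

assert np.allclose(P.T @ P, np.eye(4))     # columns are orthonormal
assert np.allclose(np.linalg.inv(P), P.T)  # hence P^{-1} = P^T
```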

There are two answers below.

Write $A=P^tDP$, where $D$ is diagonal and, after normalizing the eigenvectors, $P$ is orthogonal, i.e. $P^{-1}=P^t$. Then $A^t=(P^tDP)^t=P^tD^t(P^t)^t=P^tD^tP=P^tDP=A$, since $D^t=D$.
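This identity is easy to check numerically (a sketch assuming NumPy; the random matrices are illustrative, not from the answer): with $P$ orthogonal and $D$ real diagonal, $P^tDP$ comes out symmetric.

```python
import numpy as np

# Sketch: A = P^t D P with P orthogonal and D real diagonal is symmetric.
rng = np.random.default_rng(1)

P, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # random orthogonal P (P^{-1} = P^t)
D = np.diag(rng.standard_normal(5))               # real eigenvalues on the diagonal

A = P.T @ D @ P
assert np.allclose(A, A.T)                        # A^t = A, as derived above
assert np.allclose(P @ A @ P.T, D)                # and A diagonalizes back to D
```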

Another answer:

Suppose $\{ x_1,x_2,\cdots,x_N \}$ is an orthonormal basis of $\mathbb{R}^N$, and suppose $A$ is a real $N\times N$ matrix such that $Ax_n = \lambda_n x_n$ for real $\lambda_n$ and all $1 \le n \le N$. Then \begin{align} Ax & = A\sum_{n=1}^{N}\langle x,x_n\rangle x_n \\ &= \sum_{n=1}^{N}\langle x,x_n\rangle \lambda_n x_n. \end{align}

Therefore, for all $x,y\in\mathbb{R}^{N}$, \begin{align} \langle Ax,y\rangle& =\left\langle\sum_{n=1}^{N}\langle x,x_n\rangle\lambda_n x_n,y\right\rangle \\ &=\sum_{n=1}^{N}\lambda_n\langle x,x_n\rangle\langle x_n,y\rangle \\ &=\left\langle x,\sum_{n=1}^{N}\lambda_n\langle y,x_n\rangle x_n\right\rangle = \langle x,Ay\rangle. \end{align} In particular, $\langle Ae_n,e_m\rangle = \langle e_n,Ae_m\rangle$ for the standard basis $\{ e_n \}$; that is, $A_{mn}=A_{nm}$ for all $m,n$, so $A$ is symmetric.
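The same computation can be verified numerically (a sketch assuming NumPy; the construction is illustrative): build $A$ from an orthonormal eigenbasis via $A=\sum_n \lambda_n x_n x_n^t$ and check $\langle Ax,y\rangle=\langle x,Ay\rangle$.

```python
import numpy as np

# Sketch: A with A x_n = lambda_n x_n on an orthonormal basis {x_n} satisfies
# <Ax, y> = <x, Ay> for all x, y, hence A_{mn} = A_{nm}.
rng = np.random.default_rng(2)
N = 4

X, _ = np.linalg.qr(rng.standard_normal((N, N)))  # columns x_n: orthonormal basis
lam = rng.standard_normal(N)                      # real eigenvalues lambda_n

A = X @ np.diag(lam) @ X.T                        # A = sum_n lambda_n x_n x_n^t
assert np.allclose(A @ X, X * lam)                # A x_n = lambda_n x_n

x, y = rng.standard_normal(N), rng.standard_normal(N)
assert np.isclose((A @ x) @ y, x @ (A @ y))       # <Ax, y> = <x, Ay>
assert np.allclose(A, A.T)                        # so A is symmetric
```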