A real $n \times n$ matrix $A$ is a Gramian matrix if and only if $A$ is symmetric and all of its eigenvalues are nonnegative


An $n\times n$ real matrix $A$ is said to be a Gramian matrix if there exists a real (square) matrix $B$ such that $A=B^tB$. Prove that $A$ is a Gramian matrix if and only if $A$ is symmetric and all of its eigenvalues are nonnegative. Hint: Apply Theorem 6.17 to $T=L_A$ ($T(x)=Ax$ for all $x\in \mathbb{R}^n$) to obtain an orthonormal basis $\{v_1,\ldots,v_n\}$ of eigenvectors with associated eigenvalues $\lambda_1,\ldots,\lambda_n$. Define the linear operator $U$ by $U(v_i)=\sqrt{\lambda_i}v_i$.

Theorem 6.17: Let $T$ be a linear operator on a finite-dimensional real inner product space $V$. Then $T$ is self-adjoint (Hermitian) if and only if there exists an orthonormal basis $\beta$ for $V$ consisting of eigenvectors of $T$.

I got this exercise from Friedberg's Linear Algebra. The forward direction was relatively easy for me, but I'm having trouble proving the converse. Here is what I have so far:

Suppose that $A$ is symmetric and all of its eigenvalues are nonnegative. If we let $T=L_A$, we have that $$[T^*]_\gamma=([T]_\gamma)^*=A^*=A^t=A=[T]_\gamma$$ where $\gamma$ denotes the standard ordered basis for $\mathbb{R}^n$, $[T]_\gamma$ denotes the matrix representation of $T$ with respect to the basis $\gamma$, $T^*$ denotes the adjoint of $T$, and $A^*$ denotes the conjugate transpose of $A$.

This implies that $T^*=T$, hence $T$ is self-adjoint, so we can apply Theorem 6.17 to obtain an orthonormal basis $\beta=\{v_1,\ldots,v_n\}$ for $\mathbb{R}^n$ consisting of eigenvectors of $T$ with corresponding eigenvalues $\lambda_1,\ldots,\lambda_n$.

Define $U:\mathbb{R}^n\to \mathbb{R}^n$ by $U(v_i)=\sqrt{\lambda_i}v_i$ for $i=1,2,\ldots,n$. We only need to define its action on the basis because a linear transformation is uniquely determined by how it acts upon a basis. $U$ is also well-defined because the eigenvalues of $A$ are nonnegative, so we may take their square roots.

Note the following: $$T(v_i)=\lambda_iv_i = \sqrt{\lambda_i}^2v_i= \sqrt{\lambda_i}U(v_i) =U(\sqrt{\lambda_i}v_i)=U^2(v_i)$$ Therefore we have that $T=U^2$ because linear transformations are uniquely determined by their action on a basis.
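As a numerical sanity check (an illustration, not part of the proof), the operator $U$ can be built concretely with NumPy. Here `A` is a hypothetical symmetric matrix with nonnegative eigenvalues, and the columns of `Q` play the role of the orthonormal eigenvectors $v_i$:

```python
import numpy as np

# Hypothetical symmetric matrix with nonnegative eigenvalues
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns the eigenvalues lambda_i and an orthonormal basis of
# eigenvectors v_1, ..., v_n (the columns of Q)
lam, Q = np.linalg.eigh(A)

# Matrix of U in the standard basis, defined by U(v_i) = sqrt(lambda_i) v_i
U = Q @ np.diag(np.sqrt(lam)) @ Q.T

# U^2 agrees with T = L_A on the eigenbasis, hence everywhere
assert np.allclose(U @ U, A)
```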

Let $B=[U]_\gamma$, where $\gamma$ is again the standard ordered basis. We now have $$A=[T]_\gamma=[U^2]_\gamma=[U]_\gamma[U]_\gamma=B^2$$ Since $A$ is symmetric, we also have $$B^2=A=A^t=(B^2)^t=(B^t)^2$$ This is as far as I've gotten. If I could find a way to prove that $B=B^t$, I would be able to finish the proof, but I'm not sure whether I should even be going down this route, because the definition of a Gramian matrix does not require the matrix $B$ to be symmetric. I'd really appreciate any ideas or feedback.


I managed to solve the question. I'll continue the solution from where my original post left off.

We can view $\mathbb{R}^n$ as an inner product space under the standard inner product $\langle \cdot, \cdot \rangle$. Any vector $x \in \mathbb{R}^n$ can then be expanded as $$x = \sum\limits_{i=1}^n\langle x,v_i\rangle v_i$$ where $\beta=\{v_1,\ldots,v_n\}$ is the orthonormal basis of eigenvectors of $T$ from earlier. We now have that $U(x)=U\left(\sum\limits_{i=1}^n\langle x,v_i\rangle v_i\right)=\sum\limits_{i=1}^n\langle x,v_i\rangle U(v_i)=\sum\limits_{i=1}^n\langle x,v_i\rangle \sqrt{\lambda_i}v_i$.
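Both the orthonormal expansion of $x$ and the resulting formula for $U(x)$ are easy to verify numerically (continuing the same hypothetical NumPy setup, with the columns of `Q` as the $v_i$):

```python
import numpy as np

# Hypothetical symmetric matrix and its orthonormal eigenbasis {v_1, ..., v_n}
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)

x = np.array([3.0, -1.0])

# x = sum_i <x, v_i> v_i  (expansion in an orthonormal basis)
expansion = sum((x @ Q[:, i]) * Q[:, i] for i in range(Q.shape[1]))
assert np.allclose(expansion, x)

# U(x) = sum_i <x, v_i> sqrt(lambda_i) v_i matches the matrix of U
Ux = sum((x @ Q[:, i]) * np.sqrt(lam[i]) * Q[:, i] for i in range(Q.shape[1]))
assert np.allclose(Ux, Q @ np.diag(np.sqrt(lam)) @ Q.T @ x)
```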

Hence for any $x,y\in\mathbb{R}^n$, $$\langle U(x),y\rangle = \langle \sum\limits_{i=1}^n\langle x,v_i\rangle \sqrt{\lambda_i}v_i,y\rangle = \sum\limits_{i=1}^n\langle x,v_i\rangle\sqrt{\lambda_i}\langle v_i,y\rangle = \sum\limits_{i=1}^n \langle x,\overline{\sqrt{\lambda_i}\langle v_i,y\rangle}v_i \rangle= \sum\limits_{i=1}^n \langle x,\overline{\langle v_i,y\rangle}\sqrt{\lambda_i}v_i \rangle= \sum\limits_{i=1}^n \langle x,\langle y,v_i\rangle U(v_i) \rangle= \langle x, \sum\limits_{i=1}^n\langle y,v_i\rangle U(v_i)\rangle= \langle x, U(\sum\limits_{i=1}^n\langle y,v_i\rangle v_i)\rangle= \langle x,U(y)\rangle$$ (The conjugation bars are trivial here since the inner product is real; I keep them only to match the general formula.) So we can conclude that $U^*=U$, and therefore $B^t=B^*=([U]_\gamma)^*=[U^*]_\gamma=[U]_\gamma=B$.

In conclusion, since $B^t=B$, we have $A=B^2=B^tB$, so $A$ is a Gramian matrix.
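Putting everything together, the whole construction can be sanity-checked in NumPy on a randomly generated symmetric matrix with nonnegative eigenvalues (again a numerical illustration, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M.T @ M                       # symmetric with nonnegative eigenvalues

lam, Q = np.linalg.eigh(A)
# clip guards against tiny negative eigenvalues from floating-point round-off
B = Q @ np.diag(np.sqrt(np.clip(lam, 0.0, None))) @ Q.T

assert np.allclose(B, B.T)        # B is symmetric, since U is self-adjoint
assert np.allclose(B.T @ B, A)    # A = B^t B, so A is Gramian
```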