How to calculate the norm of a matrix using any orthonormal basis?


Show that for $X \in \mathrm{M}_n(\mathbb{C})$ and any orthonormal basis $\{u_1, \ldots , u_n\}$ of $\mathbb{C}^n$, we have $$\|X\|^2=\sum_{j,k}^n|\langle u_j,Xu_k\rangle |^2.$$ Here $\|X\|$ denotes the Frobenius (Hilbert–Schmidt) norm, $\|X\|^2=\text{tr}(X^*X)=\sum_{i,j}|X_{ij}|^2$.

My Attempt:

I thought of proving this using the projection formula, which gives the coordinates $X_{ij}=\langle u_i,Xu_j\rangle$ of $X$ with respect to the basis $\{u_1,\ldots,u_n\}$.

But I feel I am missing something: this is a $10$-mark question, so surely it needs more than a one-line argument. How should it go?

Please give me a hint; I want to solve this problem myself.

Any help will be appreciated.

There are 2 answers below.


Define a map $U$ by $Ue_k = u_k$ for every $k=1,2,\ldots,n$, where $\{e_1,e_2,\ldots, e_n\}$ is the standard basis of $\Bbb C^n$; since $U$ maps one orthonormal basis to another, it is unitary. Using $\langle e_j, Ae_k\rangle = A_{jk}$ for any matrix $A$, we have $$\begin{align*} \sum_{j=1}^n\sum_{k=1}^n\left|\langle u_j,Xu_k\rangle\right|^2&=\sum_{j=1}^n\sum_{k=1}^n\left|\langle Ue_j,XUe_k\rangle\right|^2\\&=\sum_{j=1}^n\sum_{k=1}^n\left|\langle e_j,U^*XUe_k\rangle\right|^2\\&=\|U^*XU\|^2\\&=\text{tr}\big((U^*XU)^*(U^*XU)\big)\\&=\text{tr}(U^*X^*UU^*XU)\\&=\text{tr}(U^*X^*XU)\\&=\text{tr}(X^*XUU^*)\\&=\text{tr}(X^*X)=\|X\|^2. \end{align*}$$
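The trace argument above can be checked numerically. The following sketch (not part of the proof; the matrix $X$ and the basis vectors are arbitrary hypothetical examples) builds $U$ with the orthonormal basis vectors as columns and verifies that $\|U^*XU\|^2=\|X\|^2$:

```python
import math

# Quick numerical sanity check of unitary invariance of the Frobenius norm:
# with U unitary, ||U* X U||^2 = ||X||^2.

def matmul(A, B):
    """Plain matrix product of two lists-of-rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def conj_transpose(A):
    """Conjugate transpose A*."""
    return [[A[j][i].conjugate() for j in range(len(A))]
            for i in range(len(A[0]))]

def frobenius_sq(A):
    """Squared Frobenius norm: sum of |A_ij|^2."""
    return sum(abs(x) ** 2 for row in A for x in row)

s = 1 / math.sqrt(2)
# Columns of U are the orthonormal basis u_1 = (s, s*i), u_2 = (s, -s*i) of C^2.
U = [[s, s], [s * 1j, -s * 1j]]
X = [[1, 2j], [3, 4]]            # arbitrary example matrix

Y = matmul(conj_transpose(U), matmul(X, U))   # U* X U
print(abs(frobenius_sq(Y) - frobenius_sq(X)) < 1e-12)  # True
```

The check uses only complex arithmetic from the standard library, so it runs without any third-party dependencies.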

As a different approach, we can use Parseval's identity, first in the basis $\{u_j\}$ and then in the standard basis $\{e_j\}$: $$\begin{align*} \sum_{k=1}^n\sum_{j=1}^n\left|\langle u_j,Xu_k\rangle\right|^2 &=\sum_{k=1}^n\|Xu_k\|^2\\&=\sum_{k=1}^n\sum_{j=1}^n\left|\langle e_j,Xu_k\rangle\right|^2 \\&=\sum_{j=1}^n\sum_{k=1}^n\left|\langle X^*e_j,u_k\rangle\right|^2 \\&=\sum_{j=1}^n\|X^*e_j\|^2 \\&=\sum_{j=1}^n\sum_{k=1}^n\left|\langle X^*e_j,e_k\rangle\right|^2 \\&=\sum_{j=1}^n\sum_{k=1}^n\left|\langle e_j,Xe_k\rangle\right|^2 \\&=\sum_{j=1}^n\sum_{k=1}^n\left|X_{jk}\right|^2 =\|X\|^2. \end{align*}$$
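The key Parseval step, $\sum_k\|Xu_k\|^2=\sum_{j,k}|\langle u_j,Xu_k\rangle|^2=\|X\|^2$, can also be verified numerically. This sketch (illustrative only; $X$ and the basis are hypothetical examples) checks all three quantities agree:

```python
import math

# Numerical check of the Parseval step: for an orthonormal basis {u_j},
#   ||v||^2 = sum_j |<u_j, v>|^2,  applied to v = X u_k and summed over k,
# and both equal the squared Frobenius norm of X.

def inner(u, v):
    """Hermitian inner product <u, v> = sum_i conj(u_i) v_i."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def matvec(X, v):
    """Matrix-vector product X v."""
    return [sum(X[i][j] * v[j] for j in range(len(v))) for i in range(len(X))]

s = 1 / math.sqrt(2)
basis = [[s, s * 1j], [s, -s * 1j]]   # orthonormal basis of C^2
X = [[2j, 1], [1, 1 - 1j]]            # arbitrary example matrix

col_norms = sum(abs(inner(v, v)) for v in (matvec(X, uk) for uk in basis))
parseval  = sum(abs(inner(uj, matvec(X, uk))) ** 2
                for uk in basis for uj in basis)
frobenius = sum(abs(x) ** 2 for row in X for x in row)

print(abs(col_norms - frobenius) < 1e-12
      and abs(parseval - frobenius) < 1e-12)  # True
```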


In this answer we view a vector of $n$ coordinates as a column matrix, that is, a matrix with $n$ rows and $1$ column. First note that the identity $$ \left\|X\right\|^2=\sum_{j=1}^n\sum_{k=1}^n|\langle u_j,Xu_k\rangle |^2\qquad (\ast) $$ is immediate when the basis $\{u_1,\ldots,u_n\}$ of $\mathbb{C}^n$ is the canonical basis $\{e_1,\ldots,e_n\}$, since then $\langle e_j, Xe_k\rangle = X_{jk}$. Let $\{u_1,\ldots,u_n\}$ be any orthonormal basis of $\mathbb{C}^n$, and let $U$ be the matrix whose columns are $u_1,\ldots, u_n$. That is, $$ U=\left[\; u_1 \,|\, \ldots \,|\, u_j \,|\, \ldots \,|\, u_n\;\right]. $$ Note that $$ X U= X \left[\; u_1 \,|\, \ldots \,|\, u_j \,|\, \ldots \,|\, u_n\;\right]= \left[\; Xu_1 \,|\, \ldots \,|\, Xu_j \,|\, \ldots \,|\, Xu_n\;\right] $$ and $$ U^\ast X U= U^\ast\left[\; Xu_1 \,|\, \ldots \,|\, Xu_j \,|\, \ldots \,|\, Xu_n\;\right]= \big[\langle u_i, Xu_j \rangle \big]_{n\times n}. $$ Then $$ \left\| U^\ast X U \right\|^2 = \left\| \big[\langle u_i, Xu_j \rangle \big]_{n\times n} \right\|^2 = \sum_{i=1}^{n}\sum_{j=1}^{n} |\langle u_i, Xu_j \rangle| ^2. $$ Since the columns of $U$ are orthonormal, $U$ is unitary: writing $U=(U_{ij})_{n\times n}$ and $U^\ast=(U^\ast_{ij})_{n\times n}$, we have $UU^\ast=(U_{ij})_{n\times n}\cdot(U^\ast_{ij})_{n\times n}=\left(\sum_{k=1}^{n}U_{ik}U^\ast_{kj}\right)_{n\times n}=I_{n\times n}$, and likewise $U^\ast U=I_{n\times n}$.
More explicitly, $$ \sum_{k=1}^{n}U_{ik}U^\ast_{kj}=\delta_{ij}=\begin{cases} 1 & \mbox{if } i=j\\ 0 & \mbox{if } i\neq j.\end{cases} $$ Expanding the squared modulus of each entry of $U^\ast X U$ as a double sum (the cross terms are killed by the unitarity relations), we have \begin{align} \left\|U^\ast X U\right\|^2 =& \sum_{i=1}^{n}\sum_{q=1}^{n}\left|\sum_{\alpha=1}^{n}\sum_{\beta=1}^{n}U^\ast_{i\alpha}\, X_{\alpha\beta}\, U_{\beta q}\right|^2 \\ =& \sum_{i=1}^{n}\sum_{q=1}^{n} \sum_{\alpha,\beta=1}^{n}\sum_{\alpha',\beta'=1}^{n} U^\ast_{i\alpha}\, X_{\alpha\beta}\, U_{\beta q}\, \overline{U^\ast_{i\alpha'}\, X_{\alpha'\beta'}\, U_{\beta' q}} \\ =& \sum_{\alpha,\beta=1}^{n}\sum_{\alpha',\beta'=1}^{n} X_{\alpha\beta}\,\overline{X_{\alpha'\beta'}} \Big(\sum_{i=1}^{n} U^\ast_{i\alpha}\overline{U^\ast_{i\alpha'}}\Big) \Big(\sum_{q=1}^{n} U_{\beta q}\overline{U_{\beta' q}}\Big) \\ =& \sum_{\alpha,\beta=1}^{n}\sum_{\alpha',\beta'=1}^{n} X_{\alpha\beta}\,\overline{X_{\alpha'\beta'}}\, (UU^\ast)_{\alpha'\alpha}\,(UU^\ast)_{\beta\beta'} \\ =& \sum_{\alpha,\beta=1}^{n}\sum_{\alpha',\beta'=1}^{n} X_{\alpha\beta}\,\overline{X_{\alpha'\beta'}}\, \delta_{\alpha\alpha'}\,\delta_{\beta\beta'} \\ =& \sum_{\alpha=1}^{n}\sum_{\beta=1}^{n} X_{\alpha\beta}\,\overline{X_{\alpha\beta}} \\ =& \left\|X\right\|^2. \end{align} Combining this with $(\ast)$ for the canonical basis proves the identity for any orthonormal basis.
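The structural fact driving this answer, that $(U^\ast X U)_{ij}=\langle u_i, Xu_j\rangle$ when the columns of $U$ are the $u_k$, can be checked entrywise. A minimal sketch (the matrix $X$ and the basis are hypothetical examples, not from the answer):

```python
import math

# Entrywise check: (U* X U)_{ij} == <u_i, X u_j> when the columns of U
# are the orthonormal basis vectors u_k.

def inner(u, v):
    """Hermitian inner product <u, v> = sum_i conj(u_i) v_i."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def matvec(X, v):
    return [sum(X[i][j] * v[j] for j in range(len(v))) for i in range(len(X))]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def conj_transpose(A):
    return [[A[j][i].conjugate() for j in range(len(A))]
            for i in range(len(A[0]))]

s = 1 / math.sqrt(2)
u = [[s, s * 1j], [s, -s * 1j]]       # u[0] = u_1, u[1] = u_2
U = [[u[j][i] for j in range(2)] for i in range(2)]  # columns are u_1, u_2
X = [[1 + 1j, 0], [2, -3j]]           # arbitrary example matrix

Y = matmul(conj_transpose(U), matmul(X, U))          # U* X U
ok = all(abs(Y[i][j] - inner(u[i], matvec(X, u[j]))) < 1e-12
         for i in range(2) for j in range(2))
print(ok)  # True
```

Summing $|Y_{ij}|^2$ over all entries then reproduces the right-hand side of $(\ast)$.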