On matrices with zero von Neumann entropy


I was unsure whether to post this problem in the Physics forum or in the Mathematics one. However, since I am mostly interested in the mathematical understanding of it, I am posting it here.


Suppose I have a matrix $A$ subject to the conditions (for its trace):

$$\mbox{Tr}(A) = 1$$ $$A^2 = A$$ $$A \succeq 0 $$ $$A^H = A$$

Now, from these conditions, I need to prove the following:

$$-\mbox{Tr} (A \ln{A}) = 0$$

In physics, $A$ can be seen as a pure density matrix and $-\mbox{Tr}(A\ln{A})$ is the von Neumann entropy. How does that follow from the first four conditions?
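Before diving into the proof, the four hypotheses can be checked numerically on a concrete example. Below is a small sketch (the unit vector `q` is a hypothetical choice, not part of the problem) of a rank-one projector $A = qq^T$, which is the prototypical pure density matrix:

```python
import numpy as np

# Hypothetical example: a rank-one projector A = q q^T built from a unit vector q
q = np.array([0.6, 0.8, 0.0])                   # unit vector: 0.36 + 0.64 = 1
A = np.outer(q, q)

# The four conditions from the problem statement:
assert np.isclose(np.trace(A), 1.0)             # Tr(A) = 1
assert np.allclose(A @ A, A)                    # A^2 = A  (idempotent)
assert np.all(np.linalg.eigvalsh(A) >= -1e-12)  # A >= 0   (positive semidefinite)
assert np.allclose(A, A.conj().T)               # A^H = A  (Hermitian)
```

Any unit vector `q` produces a matrix satisfying all four conditions, which is why physicists identify such matrices with pure states.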

There are 2 solutions below.

BEST ANSWER

This is a job for the spectral theorem for symmetric matrices, so let us invoke it.

Spectral theorem for real symmetric matrices. If $A=(A_{ij})_{n\times n}$ is a symmetric matrix with real entries, then:

  1. the matrix $ A $ has, counted with multiplicity, $ n$ real eigenvalues $\lambda_1\leq \ldots \leq \lambda_n$;

  2. the matrix $ A $ has $ n$ eigenvectors $q_1,\ldots, q_n\in \mathbb{R}^n$ that form an orthonormal basis of $\mathbb{R}^n$. That is, $q_i\cdot q_j=0$ whenever $i\neq j$, and $q_i\cdot q_i=1$ for every $i$;

  3. the matrix $ A $ admits the spectral decomposition $$ A= Q^T\Lambda Q $$ where $Q$ is an orthogonal matrix (meaning that $Q^T = Q^{-1}$) whose rows are the eigenvectors of $A$ (meaning that $Q^T=[\,q_1\,|\ldots\,|\,q_n\,]$), and $ \Lambda $ is a diagonal matrix whose diagonal elements are the eigenvalues of $A$ (meaning that $\Lambda=\mbox{diag}(\lambda_1,\ldots,\lambda_n)$).
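The theorem can be checked numerically. As a sketch, `numpy.linalg.eigh` returns the eigenvalues in ascending order together with an orthogonal matrix whose *columns* are the eigenvectors; transposing it matches the convention $A = Q^T\Lambda Q$ used here (the $2\times 2$ matrix below is a hypothetical example):

```python
import numpy as np

# Hypothetical symmetric matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, V = np.linalg.eigh(A)   # eigenvalues ascending; columns of V are eigenvectors
Lambda = np.diag(lam)
Q = V.T                      # rows of Q are now the eigenvectors q_i

# Q is orthogonal: Q^T = Q^{-1}
assert np.allclose(Q.T @ Q, np.eye(2))
# spectral decomposition: A = Q^T Λ Q
assert np.allclose(Q.T @ Lambda @ Q, A)
```

For this example the eigenvalues come out as $\lambda_1 = 1$ and $\lambda_2 = 3$.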

Let's work out the hypotheses of the problem one by one.

  • By $A^T=A$ we have $ A=Q^T\Lambda Q $, where $Q$ and $\Lambda$ are as in the spectral theorem stated above.

  • Since $A$ is symmetric, $A\succeq 0$ means that all of its eigenvalues are nonnegative: $ 0\leq \lambda_1,\ldots, 0\leq \lambda_n $.

  • $A^2=A$ implies \begin{align} A^2=A\Leftrightarrow & (Q^{-1}\Lambda Q)(Q^{-1}\Lambda Q)=Q^{-1}\Lambda Q \\ \Leftrightarrow & Q^{-1}\Lambda^2 Q=Q^{-1}\Lambda Q \\ \Leftrightarrow & \Lambda^2 Q=\Lambda Q \\ \Leftrightarrow & \Lambda^2 =\Lambda \end{align} By induction, the same manipulation shows that $$ A^m=A \quad \mbox{ and } \quad \Lambda^m=\Lambda, \quad \mbox{ for all integers } m\geq 1 $$

  • Since for any three matrices $ U,V,W\in\mathbb{R}^{n\times n}$ the trace is invariant under cyclic shifts, $\mbox{trace}(U\cdot V\cdot W)=\mbox{trace}(V\cdot W\cdot U)$, we have $$ \mbox{trace}(A)=\mbox{trace}(Q^{-1}\Lambda Q)= \mbox{trace}(\Lambda QQ^{-1})=\mbox{trace}(\Lambda) =\lambda_1+\ldots+\lambda_{n}=1 $$
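The last two bullet points can be sanity-checked numerically. The sketch below uses a hypothetical rank-one projector for the idempotence claim and random matrices for the cyclic-trace property (a numerical check, not a proof):

```python
import numpy as np

# Idempotence: for a symmetric A with A^2 = A, the eigenvalues satisfy λ² = λ,
# so each λ is 0 or 1. Hypothetical rank-one projector A = q qᵀ:
q = np.array([1.0, 2.0, 2.0]) / 3.0        # unit vector: (1 + 4 + 4)/9 = 1
A = np.outer(q, q)
assert np.allclose(A @ A, A)

lam = np.linalg.eigvalsh(A)
assert np.allclose(lam**2, lam)                      # Λ² = Λ
assert np.allclose(np.linalg.matrix_power(A, 7), A)  # A^m = A for m >= 1

# Cyclic invariance of the trace: trace(UVW) = trace(VWU) = trace(WUV)
rng = np.random.default_rng(0)
U, V, W = (rng.standard_normal((3, 3)) for _ in range(3))
t = np.trace(U @ V @ W)
assert np.isclose(t, np.trace(V @ W @ U))
assert np.isclose(t, np.trace(W @ U @ V))
# note: trace(UVW) != trace(VUW) in general — only cyclic shifts are safe
```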


Since $\mbox{trace}(A^m)=\mbox{trace}(A)=\mbox{trace}(\Lambda)=\mbox{trace}(\Lambda^m)=1$, it is easy to see that $$ \lambda_1^m+\ldots+\lambda_n^m=1\quad \mbox{ for all integers } m\geq 1 $$ And since $0\leq \lambda_1\leq \ldots \leq \lambda_{n}$, this is only possible if $$ \lambda_1=\ldots=\lambda_{n-1}=0 \quad\mbox{ and }\quad \lambda_n=1 $$

Recall that if $U$ and $V$ are two symmetric matrices that commute (so that they are simultaneously diagonalizable, with eigenvalues $\lambda_1(U),\ldots, \lambda_n(U)$ and $\lambda_1(V),\ldots, \lambda_n(V)$ paired by a common eigenbasis), then $$ \mbox{trace}(U\cdot V)= \lambda_1(U)\cdot \lambda_1(V)+ \ldots +\lambda_n(U)\cdot \lambda_n(V) $$ Since $A$ commutes with $\ln A$, \begin{align} \mbox{trace}(A\ln A) =& \lambda_1(A)\cdot \lambda_1(\ln A)+ \ldots +\lambda_n(A)\cdot \lambda_n(\ln A) \\ =& 0\cdot \lambda_1(\ln A)+ \ldots +0\cdot \lambda_{n-1}(\ln A)+1\cdot \lambda_n(\ln A) \\ =& \lambda_n(\ln A) \end{align} (for the zero eigenvalues one uses the convention $0\cdot\ln 0=0$, the continuous extension of $x\ln x$ to $x=0$). Finally, the eigenvalue of $\ln A$ paired with $\lambda_n(A)=1$ is $\ln\lambda_n(A)=\ln 1=0$, hence $$ -\mbox{trace}(A\ln A)=-\lambda_n(\ln A)=0 $$ as desired.
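The concluding argument can be illustrated numerically: for a projector with unit trace the power sums $\lambda_1^m+\ldots+\lambda_n^m$ all equal $1$, which pins the spectrum to $(0,\ldots,0,1)$, and the entropy then vanishes. A sketch (the unit vector `q` is again a hypothetical choice, and $0\ln 0 = 0$ is applied by skipping zero eigenvalues):

```python
import numpy as np

# Hypothetical pure density matrix A = q qᵀ with q a unit vector
q = np.array([0.0, 0.6, 0.8])
A = np.outer(q, q)

lam = np.sort(np.linalg.eigvalsh(A))
# power sums are identically 1 ...
for m in range(1, 8):
    assert np.isclose(np.sum(lam**m), 1.0)
# ... which forces the spectrum (0, ..., 0, 1)
assert np.allclose(lam, [0.0, 0.0, 1.0])

# von Neumann entropy with the convention 0·ln 0 = 0
entropy = -sum(l * np.log(l) for l in lam if l > 1e-12)
assert np.isclose(entropy, 0.0)
```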

ANOTHER ANSWER

Following Darij Grinberg in the comments, we may assume WLOG that $A$ is diagonal. From $A^2 = A$ we then see that the diagonal entries are all either $1$ or $0$, and $\mbox{Tr}(A)=1$ forces exactly one of them to be $1$. You can fill in the rest from there.
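This diagonal reduction can be spelled out in a few lines (a sketch; the $3\times 3$ size is an arbitrary choice). With one diagonal entry equal to $1$ and the rest $0$, the only term in $-\mbox{Tr}(A\ln A)$ is $-1\cdot\ln 1 = 0$:

```python
import numpy as np

# Diagonal case: entries in {0, 1} with trace 1 means exactly one entry is 1
A = np.diag([0.0, 1.0, 0.0])
assert np.allclose(A @ A, A)            # A^2 = A
assert np.isclose(np.trace(A), 1.0)     # Tr(A) = 1

# entropy term by term, skipping zeros (the convention 0·ln 0 = 0)
entropy = -sum(a * np.log(a) for a in np.diag(A) if a > 0)
assert np.isclose(entropy, 0.0)         # -1·ln 1 = 0
```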