Decomposition of hermitian matrix as difference of positive semidefinite matrices


In my reference (Box 11.2, page 512, Chapter 11, "Entropy and Information", *Quantum Computation and Quantum Information* by Nielsen and Chuang), the proof of Fannes' inequality contains

[image: excerpt from the proof in Nielsen and Chuang]

Here $\rho$ and $\sigma$ are positive semidefinite (Hermitian with nonnegative eigenvalues), with eigenvalues lying in the interval $[0,1]$ and summing to one, so the eigenvalues constitute a probability distribution.

In another document (Theorem 10.2 of John Watrous' lecture notes, Lecture 10: "Continuity of von Neumann entropy; quantum relative entropy"), it is given that

[image: excerpt from Watrous' lecture notes]

How can we prove that any Hermitian operator can be decomposed as the difference of two positive semidefinite operators?


My attempt: here $A=VD_AV^\dagger$ and $B=UD_BU^\dagger$ need not commute (they need not be simultaneously diagonalizable).

The difference of two Hermitian matrices is Hermitian, i.e., $H=A-B$ is Hermitian.

Write $H=A-B=WD_HW^\dagger$. We are free to choose diagonal matrices $D_p, D_q \ge 0$ such that $D_H=D_p-D_q$, and then $$H=WD_HW^\dagger=W(D_p-D_q)W^\dagger=WD_pW^\dagger-WD_qW^\dagger=P-Q.$$
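This sketch can be checked numerically. Below is a minimal illustration (assuming `numpy`; the matrices and the shift `c` are arbitrary choices for illustration). The shift also shows that $D_p$ and $D_q$ are not unique:

```python
import numpy as np

# Build an arbitrary Hermitian H (a stand-in for A - B).
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = (X + X.conj().T) / 2

eigvals, W = np.linalg.eigh(H)                 # H = W D_H W^dagger
c = 0.5                                        # any shift c >= 0 works
D_p = np.diag(np.maximum(eigvals, 0) + c)
D_q = np.diag(np.maximum(-eigvals, 0) + c)     # D_H = D_p - D_q

P = W @ D_p @ W.conj().T
Q = W @ D_q @ W.conj().T

assert np.allclose(P - Q, H)                   # H = P - Q
assert np.linalg.eigvalsh(P).min() >= -1e-9    # P >= 0
assert np.linalg.eigvalsh(Q).min() >= -1e-9    # Q >= 0
```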

It would be helpful if one could point in the right direction.


2 Answers

BEST ANSWER

Since $\rho - \sigma$ is Hermitian, you can diagonalize it, and all its eigenvalues lie in $[-2,2]$ by the triangle inequality. Let $Q$ be the diagonal matrix whose entries are exactly the nonnegative eigenvalues of $\rho - \sigma$, and let $R$ be the diagonal matrix whose entries are the negatives of the negative eigenvalues of $\rho - \sigma$. Then both $Q$ and $R$ are positive semidefinite and, if you've ordered the eigenvalues consistently, $Q-R = \rho-\sigma$ after a suitable unitary change of basis. Note that $Q$ and $R$ have orthogonal support, since $Q$ is supported only on the positive eigenspace and $R$ only on the negative eigenspace, and these are orthogonal (no eigenvalue is both negative and positive).
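A quick numerical sketch of this construction (assuming `numpy`; the random density matrices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def random_density(n):
    # A hypothetical random density matrix: positive semidefinite, trace 1.
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    M = X @ X.conj().T
    return M / np.real(np.trace(M))

rho, sigma = random_density(4), random_density(4)
w, U = np.linalg.eigh(rho - sigma)                 # spectral decomposition

Q = U @ np.diag(np.maximum(w, 0)) @ U.conj().T     # positive part
R = U @ np.diag(np.maximum(-w, 0)) @ U.conj().T    # minus the negative part

assert np.allclose(Q - R, rho - sigma)             # Q - R = rho - sigma
assert np.allclose(Q @ R, 0)                       # orthogonal support
assert np.all(w >= -2) and np.all(w <= 2)          # eigenvalues in [-2, 2]
```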


Let $A$ be a generic Hermitian operator. Given an orthonormal basis of its eigenvectors and the associated eigenvalues, we can always decompose it as $$A = \sum_{k=1}^n \lambda_k P_k,$$ where $P_k\equiv |u_k\rangle\!\langle u_k|$ and $|u_k\rangle$ is the eigenvector of $A$ corresponding to the eigenvalue $\lambda_k$. Because $A$ is Hermitian, we have $\lambda_k\in\mathbb{R}$.

Now consider the terms in this decomposition corresponding to positive and negative eigenvalues. Let $S_\pm \subseteq [n]\equiv \{1,\dots, n\}$ be the subsets of indices associated with positive and negative eigenvalues, respectively. Thus $$A = \underbrace{\sum_{k\in S_+} \lambda_k P_k}_{\equiv A_+} + \underbrace{\sum_{k\in S_-} \lambda_k P_k}_{\equiv -A_-}.$$ Here I defined $A_\pm$ to equal the two terms in the equation as written. The operator $A_+$ is clearly positive semidefinite, as it is Hermitian with nonnegative eigenvalues. But $A_-$ is positive semidefinite for the same reason, considering that $\lambda_k<0$ for $k\in S_-$, so the eigenvalues $-\lambda_k$ of $A_-$ are positive. It follows that $A=A_+-A_-$ with $A_\pm \ge 0$.
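The projector construction above can be sketched directly (assuming `numpy`; the test matrix is arbitrary):

```python
import numpy as np

# An arbitrary Hermitian test matrix.
rng = np.random.default_rng(2)
X = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (X + X.conj().T) / 2

lam, U = np.linalg.eigh(A)
zero = np.zeros((4, 4), dtype=complex)

# P_k = |u_k><u_k| built from column k of U; sum over positive / negative eigenvalues.
A_plus = sum((lam[k] * np.outer(U[:, k], U[:, k].conj())
              for k in range(4) if lam[k] > 0), zero)
A_minus = sum((-lam[k] * np.outer(U[:, k], U[:, k].conj())
               for k in range(4) if lam[k] < 0), zero)

assert np.allclose(A_plus - A_minus, A)              # A = A_+ - A_-
assert np.linalg.eigvalsh(A_plus).min() >= -1e-9     # A_+ >= 0
assert np.linalg.eigvalsh(A_minus).min() >= -1e-9    # A_- >= 0
```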