A convex combination of unitary transforms converts any matrix to identity


Question

Show that there exist a set of unitary matrices $\{U_i\}$ and probabilities $\{p_i\}$ such that for any $n \times n$ matrix $A$ \begin{equation} \tag{1} \sum_{i} p_i U_i A U^{\dagger}_i = \text{tr}(A) \frac{I}{n}. \end{equation}

Attempts

For $n=2$, it is easy to show \begin{equation} \frac{1}{4} ( \sigma^x A \sigma^x + \sigma^y A \sigma^y + \sigma^z A \sigma^z + I A I ) = \text{tr}(A) I / 2, \end{equation} where $\sigma^{x,y,z}$ are the Pauli matrices. The idea comes from the Kraus operator-sum representation.
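A quick numerical sanity check of this $n=2$ identity (a sketch using NumPy; `A` is an arbitrary random complex matrix):

```python
import numpy as np

# Pauli matrices plus the identity (the n = 2 Kraus set from the text)
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))  # arbitrary 2x2 matrix

# Equal-weight average of the four conjugations: (1/4) sum_i U_i A U_i^dagger
twirl = sum(U @ A @ U.conj().T for U in (I2, sx, sy, sz)) / 4

assert np.allclose(twirl, np.trace(A) * I2 / 2)  # equals tr(A) I / 2
```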

We can then generalize to dimension $n = 2^m$, where the $U_i$ can be taken as tensor products of these matrices, but not to arbitrary dimension.

In indices, Eq. (1) is equivalent to \begin{equation} \sum_i p_i (U_i)_{ab} (U_i^*)_{dc} = \delta_{bc} \delta_{ad} / n. \end{equation} This looks like the orthogonality relations for finite-dimensional irreducible unitary representations of a finite group; see the Peter-Weyl theorem. But again this only works when the group $G$ has an irreducible representation of dimension $n$, and in that case all the $p_i$ are equal.

I feel that the "right" proof should not rely on these additional structures.



Best answer

An attempted proof of existence that doesn't actually construct the spanning set and distribution.

First, we note that the set of unitary matrices spans $\Bbb C^{n \times n}$; this can be proved nicely using the polar decomposition. It follows that there exists a basis $\{U_1,U_2,\dots,U_{n^2}\}$ of $\Bbb C^{n \times n}$ consisting of unitary matrices.

It follows that the vectors $\operatorname{vec}(U_1),\dots,\operatorname{vec}(U_{n^2})$ span $\Bbb C^{n^2}$.

The argument below is incorrect

(Thus, there necessarily exist (positive) $p_i$ such that $$ \frac 1n I_{n^2} = \sum_{i} p_i \operatorname{vec}(U_i)\operatorname{vec}(U_i)^\dagger. $$ We would correspondingly find that these $U_i$ satisfy $\sum_{i} p_i U_iA U_i^\dagger = \frac 1n \operatorname{tr}(A) I$, as desired.)


Some clarification:

First of all, the linear-span claim. Let $\langle \cdot, \cdot \rangle$ denote the Frobenius (Hilbert-Schmidt) inner product. Suppose that $A$ lies in the orthogonal complement of the span of the unitary matrices, and let $A = UP$ be a polar decomposition. Then $$ 0 = \langle U, A \rangle = \operatorname{tr}(U^\dagger A) = \operatorname{tr}(U^\dagger UP) = \operatorname{tr}(P), $$ but $P$ is positive semidefinite, so $\operatorname{tr}(P) = 0$ implies that $P = 0$. Thus, $A$ must be zero.

So, the span of the unitary matrices is all $\Bbb C^{n \times n}$.
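The polar-decomposition step can be checked numerically; a sketch in NumPy, building $A = UP$ from the SVD of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# Polar decomposition A = U P via the SVD A = W diag(S) V^dagger:
# U = W V^dagger is unitary, P = V diag(S) V^dagger is positive semidefinite.
W, S, Vh = np.linalg.svd(A)
U = W @ Vh
P = Vh.conj().T @ np.diag(S) @ Vh

assert np.allclose(U @ P, A)
assert np.allclose(U.conj().T @ U, np.eye(n))
# <U, A> = tr(U^dagger A) = tr(P) = sum of singular values > 0 for A != 0,
# so no nonzero A is orthogonal to every unitary matrix.
assert np.isclose(np.trace(U.conj().T @ A), S.sum())
```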


Another observation:

Let $\mathcal C_U$ denote the convex cone generated by the set $\{uu^* : u = \operatorname{vec}(U) \text{ for some unitary } U \}$. Showing that $\sum_{i} p_i \operatorname{vec}(U_i)\operatorname{vec}(U_i)^\dagger = \frac 1n I$ can be achieved with non-negative coefficients $p_i$ means that we're trying to show that $I \in \mathcal C_U$ (the cone is closed under positive scaling, so $I$ and $\frac 1n I$ are interchangeable here).
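For the $n = 2$ Pauli set one can check directly that $\frac 1n I_{n^2}$ lies in $\mathcal C_U$ with equal weights $p_i = 1/4$ (a sketch; `vec` stacks columns):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# vec stacks columns (Fortran order matches the usual vec convention)
vec = lambda U: U.flatten(order="F").reshape(-1, 1)

# With p_i = 1/4 the Pauli set realizes (1/n) I_{n^2} as a point of C_U
M = sum(vec(U) @ vec(U).conj().T for U in (I2, sx, sy, sz)) / 4
assert np.allclose(M, np.eye(4) / 2)
```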


One orthogonal basis for $\Bbb C^{n \times n}$ consisting of unitary matrices is as follows: let $\omega = e^{2\pi i/n}$ and $$ X = \pmatrix{0&&&&1\\1&0\\&1&0\\&&\ddots\\&&&1}, \qquad Z = \pmatrix{1\\ & \omega \\ && \ddots \\ &&& \omega^{n-1}}. $$ Then the matrices $\{Z^j X^k : 0 \leq j,k \leq n-1\}$ form our orthogonal basis.
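For general $n$ this clock-and-shift basis does the job with equal weights $p_i = 1/n^2$; a numerical sketch for $n = 3$:

```python
import numpy as np

n = 3
omega = np.exp(2j * np.pi / n)

# Cyclic shift X and clock Z, as defined above
X = np.roll(np.eye(n, dtype=complex), 1, axis=0)
Z = np.diag(omega ** np.arange(n))

basis = [np.linalg.matrix_power(Z, j) @ np.linalg.matrix_power(X, k)
         for j in range(n) for k in range(n)]

# Pairwise Frobenius orthogonality: <Z^j X^k, Z^j' X^k'> = n * delta
for a in range(n * n):
    for b in range(n * n):
        ip = np.trace(basis[a].conj().T @ basis[b])
        assert np.isclose(ip, n if a == b else 0)

# Equal weights p_i = 1/n^2 then give the identity (1)
rng = np.random.default_rng(2)
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
out = sum(U @ A @ U.conj().T for U in basis) / n**2
assert np.allclose(out, np.trace(A) * np.eye(n) / n)
```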

Another answer

I feel this is cheating a bit and there should be a more interesting answer, but this had a representation-theory flavor to me and I believe it answers the question as asked. First, note that $\operatorname{tr}(A)=\operatorname{tr}\left(\sum_i p_i U_iAU_i^\dagger\right)$ for any unitaries $U_i$. Second, note that when $\{U_i\}$ is the set of representation matrices of a representation of a group $G$ and $p_i = \frac{1}{|G|}$, the group average $\sum_i p_i U_iAU_i^\dagger$ is invariant under conjugation by the representation: for every $j$, $$U_j\left(\sum_i p_i U_iAU_i^\dagger\right)U_j^\dagger=\sum_i p_i U_iAU_i^\dagger.$$

Let $G$ be the group of signed permutations, i.e. the group generated by the subgroup of $n\times n$ permutation matrices and the subgroup of diagonal matrices with $\pm 1$'s on the diagonal. These matrices are all unitary. The permutation matrices are generated by pairwise transpositions; conjugating by one of these swaps a pair of rows and the corresponding pair of columns, from which it follows that a matrix invariant under conjugation by all permutations must have a constant diagonal and constant off-diagonal entries. Similarly, conjugation by a diagonal matrix with $1$'s and a single $-1$ on the diagonal leaves the diagonal unchanged but flips the sign of the off-diagonal elements in the corresponding row and column, so a matrix invariant under this diagonal subgroup must have zero off-diagonal entries.

Taken together, the only matrices invariant under both subgroups, hence under the whole group, are the multiples of the identity matrix. The result follows.
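A direct numerical check of this group average (a sketch for $n = 3$, where $|G| = 2^n n! = 48$):

```python
import numpy as np
from itertools import permutations, product
from math import factorial

n = 3
rng = np.random.default_rng(3)
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

# Average A over the full group of signed permutation matrices (|G| = 2^n n!)
total = np.zeros((n, n), dtype=complex)
count = 0
for perm in permutations(range(n)):
    P = np.eye(n)[list(perm)]          # permutation matrix
    for signs in product([1, -1], repeat=n):
        U = np.diag(signs) @ P         # signed permutation, real orthogonal
        total += U @ A @ U.T           # U is real, so U^dagger = U^T
        count += 1

assert count == 2**n * factorial(n)
assert np.allclose(total / count, np.trace(A) * np.eye(n) / n)
```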