Given an invertible complex $n\times n$ matrix $P$, does $PP^*$ commute with all complex $n\times n$ matrices?


The answer is a simple no: if $P= \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}$ then $PP^*= \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}$,

and $PP^* \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}$,

while $\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} PP^*= \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$.
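The counterexample can be verified numerically; a minimal NumPy sketch (assuming $P$ is the matrix with a single $1$ in the lower-left entry):

```python
import numpy as np

# Counterexample from the question: P with a single 1 in the (2,1) entry.
P = np.array([[0, 0],
              [1, 0]], dtype=complex)
PPstar = P @ P.conj().T          # equals [[0,0],[0,1]]

B = np.array([[0, 1],
              [0, 0]], dtype=complex)

print(PPstar @ B)   # the zero matrix
print(B @ PPstar)   # [[0,1],[0,0]] -- so PP* and B do not commute
```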

But I seem to have a set of arguments which says the answer to the question is YES. Please help me find my mistake.

Fact 1: A complex vector space together with a positive definite hermitian form is called a hermitian space.

Fact 2: A change of basis to $\textbf{B'}$ from the standard basis $\textbf{E}$ is given by $\textbf{B'}=\textbf{E}P$, and this changes the standard hermitian form $X^* Y$ to $X'^*PP^*Y'$, where $X',Y'$ are the new coordinate vectors. (It was assumed that $P$ is invertible.)

Fact 3: For an arbitrary linear transformation $T$ on a hermitian space, $\langle T(v), w\rangle = \langle v, T^*(w)\rangle$ for all $v,w$ in the hermitian space.
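Fact 3, with the standard hermitian form and $T^*$ represented by the conjugate-transpose matrix, can be sanity-checked numerically (a sketch with random data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

inner = lambda x, y: x.conj() @ y    # standard hermitian form <x, y> = x* y

lhs = inner(A @ v, w)                # <T(v), w>
rhs = inner(v, A.conj().T @ w)       # <v, T*(w)>, with T* the conjugate transpose
print(np.isclose(lhs, rhs))         # True
```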

Now, let the hermitian space be the space of $n$-dimensional complex column vectors.

Given $P$ an invertible matrix we change to a basis given by $\textbf{B'}=\textbf{E}P$.

Thus $\langle v, w\rangle = X^*PP^*Y$, where $X,Y$ are the coordinate vectors in the new basis.

Now given a complex matrix $A$, let left multiplication by $A$ to column vectors written in the new basis be the linear transformation $T$ on the hermitian space.

Now, translating Fact 3 into coordinate-vector form, we have:

$\langle T(v), w\rangle = (AX)^*PP^*Y = X^*A^*PP^*Y$

$\langle v, T^*(w)\rangle = X^*PP^*(A^*Y) = X^*PP^*A^*Y$

but since both are equal by Fact 3,

$X^*A^*PP^*Y = X^*PP^*A^*Y$

Since this is true for arbitrary $X,Y$ it implies that:

$A^*PP^*\ =\ PP^*A^*$
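The conclusion can be falsified numerically for a random (generically invertible) $P$ and a random $A$, which confirms the derivation must contain a mistake somewhere (a minimal sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
# A random complex matrix is invertible with probability 1.
P = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

PPstar = P @ P.conj().T
Astar = A.conj().T

# If the derivation were right, these products would agree for every A.
print(np.allclose(Astar @ PPstar, PPstar @ Astar))   # False
```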



BEST ANSWER

When you perform your change of basis from the standard basis to the basis of $P$, as you note we have the relation $\vec{v}_s = P \vec{v}_p$, where $\vec{v}_s$ is in the standard basis and $\vec{v}_p$ is in the $P$ basis. It is also correct that given two vectors $\vec{v}_p$ and $\vec{w}_p$ in the $P$ basis, the inner product is given by $\langle \vec{v}_p, \vec{w}_p \rangle_P = \vec{v}_p^* P^* P \vec{w}_p$.

The problem is that the statement $$\langle A\vec{v}_p, \vec{w}_p \rangle_P = \langle \vec{v}_p, A^*\vec{w}_p \rangle_P$$ is not actually true for all transformations $A$ for our inner product. That property holds only for the canonical inner product $\langle \vec{v}, \vec{w} \rangle_S = \vec{v}^* \vec{w}$ and its positive multiples (i.e. $\langle \vec{v}, \vec{w} \rangle = k \vec{v}^* \vec{w}$ for some positive $k$), and our inner product here isn't one of them.

Note that we can deduce a similar-looking property for this inner product, namely that $$\langle A \vec{v}_p, \vec{w}_p \rangle_P = \langle \vec{v}_p, A_* \vec{w}_p\rangle_P$$ where $A_*$ is defined to be the matrix such that $A^* P^* P = P^* P A_*$. (This might seem trivial, but it's really the only property we can deduce without further information about $P$.)
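Both points can be checked numerically: the naive adjoint identity fails for a generic $P$, while the corrected adjoint $A_*$, computed here as $G^{-1} A^* G$ with $G = P^*P$, works. A sketch with random data; `inner_P` and `A_adj` are ad-hoc helper names:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
P = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

G = P.conj().T @ P                       # Gram matrix P*P of the inner product
inner_P = lambda x, y: x.conj() @ G @ y  # <x, y>_P = x* (P*P) y

# <A v, w>_P vs <v, A* w>_P: not equal in general
print(np.isclose(inner_P(A @ v, w), inner_P(v, A.conj().T @ w)))   # False

# The true adjoint wrt <-,->_P solves A* G = G A_*, i.e. A_* = G^{-1} A* G
A_adj = np.linalg.solve(G, A.conj().T @ G)
print(np.isclose(inner_P(A @ v, w), inner_P(v, A_adj @ w)))        # True
```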

On the other hand, what is true is that given two vectors $\vec{v}_s$ and $\vec{w}_s$ in the standard basis with the canonical inner product structure $\langle -, - \rangle_S$, we have for any linear transformation matrix $A$ $$\langle A \vec{v}_s, \vec{w}_s \rangle_S = \langle A P \vec{v}_p, P \vec{w}_p \rangle_S = \langle \vec{v}_s, A^* \vec{w}_s \rangle_S = \langle P \vec{v}_p, A^* P \vec{w}_p \rangle_S$$ But you will notice that this statement only trivially implies $P^* A^* P = P^* A^* P$.

ADDENDUM: Here are answers to your questions:

  1. What is meant by $\langle -, - \rangle_S$ and $\langle -, - \rangle_P$? They are just two inner products: $$\langle \vec{v}, \vec{w} \rangle_S = \vec{v}^* \vec{w}, ~ \langle \vec{v}, \vec{w} \rangle_P = \vec{v}^* P^* P \vec{w}$$ for some arbitrary fixed matrix $P$. While these formulas don't need a basis to make sense, when associating an inner product with a vector space, you have to specify a basis. What I mean to emphasize is the following. If $\vec{v}$ and $\vec{w}$ are two vectors, with $\vec{v}_s$ and $\vec{w}_s$ as their representations in the standard basis, and $\vec{v}_p$ and $\vec{w}_p$ as their representations in the $P$ basis, then the inner product $\langle \vec{v}_p, \vec{w}_p \rangle_P$ is equal to the canonical inner product $\langle \vec{v}_s, \vec{w}_s \rangle_S$. This is because $\vec{v}_s = P \vec{v}_p$, and similarly for $\vec{w}$.
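This relationship between the two inner products can be checked numerically (a sketch with random data):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
P = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
v_p = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w_p = rng.standard_normal(n) + 1j * rng.standard_normal(n)

v_s, w_s = P @ v_p, P @ w_p              # standard-basis coordinates: v_s = P v_p

inner_S = lambda x, y: x.conj() @ y
inner_P = lambda x, y: x.conj() @ (P.conj().T @ P) @ y

# <v_p, w_p>_P equals <v_s, w_s>_S
print(np.isclose(inner_P(v_p, w_p), inner_S(v_s, w_s)))   # True
```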

  2. Should the line $$\langle A \vec{v}_s, \vec{w}_s \rangle_S = \langle A P \vec{v}_p, P \vec{w}_p \rangle_S = \langle \vec{v}_s, A^* \vec{w}_s \rangle_S = \langle P \vec{v}_p, A^* P \vec{w}_p \rangle_S$$ instead read as $$\langle A \vec{v}_s, \vec{w}_s \rangle_S = \langle A P \vec{v}_p, P \vec{w}_p \rangle_P = \langle \vec{v}_s, A^* \vec{w}_s \rangle_S = \langle P \vec{v}_p, A^* P \vec{w}_p \rangle_P ~?$$ No, it should not; let's break it down. The first equality just uses the fact that $\vec{v}_s = P \vec{v}_p$, and likewise for $\vec{w}_s$. What you suggest would claim that $$\langle A \vec{v}_s, \vec{w}_s \rangle_S = \langle A P \vec{v}_p, P \vec{w}_p \rangle_P = (\vec{v}_p^* P^* A^*) P^* P (P \vec{w}_p)$$ which isn't true. Remember, the crucial reason the logic you presented fails is because $$\langle A \vec{v}, \vec{w} \rangle_P \neq \langle \vec{v}, A^* \vec{w} \rangle_P$$ in general.

ANOTHER ANSWER

The matrix of the adjoint operator $T^*$ is the conjugate transpose of the matrix of the operator $T$, with respect to an orthonormal basis.

In other words: if $A$ is the matrix of $T$ with respect to a basis which is not orthonormal, then $A^*$ may not be the matrix of $T^*$ with respect to this basis.

But I assumed in my derivation [the question above] that if $A$ is the matrix of an operator with respect to an arbitrary basis, then $A^*$ is the matrix of the adjoint operator with respect to that same basis, which is not necessarily true.

Since the above is true when the basis is orthonormal, the derivation is valid only if the change of basis takes an orthonormal basis to another orthonormal basis. This means $P$ must be unitary, which forces $PP^*$ to be $I$, the identity matrix, and it is no surprise that the identity matrix commutes with all complex matrices of the same size.
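A quick numerical illustration of this last point, using a random unitary $P$ obtained from a QR factorization (a sketch):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
P, _ = np.linalg.qr(Z)    # the Q factor of a QR factorization is unitary

# PP* = I, which of course commutes with every complex matrix
print(np.allclose(P @ P.conj().T, np.eye(n)))   # True
```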