Commutation and Sharing Same Eigenvalues


Suppose $A$ and $B$ have a common eigenvector $\psi$, with eigenvalues $$A \psi = a\psi$$ and $$B\psi = b\psi.$$ Then we can easily show that they commute on $\psi$:

$$AB\psi=Ab\psi=bA\psi=ba\psi$$ $$BA\psi=Ba\psi=aB\psi=ab\psi$$ So $AB\psi=BA\psi$, and the operators commute on $\psi$: $$[A,B]\psi=0$$
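As a numerical sanity check, here is a minimal sketch in numpy (the matrices and the eigenvalues $a=2$, $b=3$ are arbitrary illustrative choices, not from the question):

```python
import numpy as np

# Hypothetical example: two diagonal (hence commuting) matrices that
# share the eigenvector psi = (1, 0).
A = np.diag([2.0, 5.0])
B = np.diag([3.0, 7.0])
psi = np.array([1.0, 0.0])

# A psi = a psi with a = 2, and B psi = b psi with b = 3.
a, b = 2.0, 3.0
assert np.allclose(A @ psi, a * psi)
assert np.allclose(B @ psi, b * psi)

# AB psi = ab psi = ba psi = BA psi, so the commutator annihilates psi.
assert np.allclose(A @ B @ psi, a * b * psi)
assert np.allclose(B @ A @ psi, b * a * psi)
assert np.allclose((A @ B - B @ A) @ psi, 0.0)
```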

But what if the two operators have different eigenvectors, each with its own eigenvalues? Can we still show that they share a common set of eigenvectors?

$$A\alpha_n = a_n\alpha_n$$ $$B\beta_n = b_n\beta_n$$


I read the question as the following concern:

Basically, I am trying to show that if $[A,B]=0$, then I can pick the same set of eigenvectors, since $\alpha$ must be proportional to $\beta$. Is it possible?

There is a theorem which says :

If $\Omega$ and $\Lambda$ are two commuting Hermitian operators, there exists (at least) a basis of common eigenvectors that diagonalizes them both.

The main point is the existence of a complete eigenbasis. Note that the operators should be Hermitian, so that each one is guaranteed at least one orthonormal basis of eigenvectors.

The proof of the theorem in the nondegenerate case goes as follows:

$$\Omega|\omega_i\rangle=\omega_i|\omega_i\rangle$$ $$\Lambda\Omega|\omega_i\rangle=\Lambda\omega_i|\omega_i\rangle$$ $$\Omega\Lambda|\omega_i\rangle=\omega_i\Lambda|\omega_i\rangle$$ i.e. $\Lambda|\omega_i\rangle$ is an eigenvector of $\Omega$ with eigenvalue $\omega_i$. Since the spectrum is nondegenerate, this eigenvector is unique up to a scale factor, so $$\Lambda|\omega_i\rangle=\lambda_i|\omega_i\rangle$$ Thus $|\omega_i\rangle$ is also an eigenvector of $\Lambda$, with eigenvalue $\lambda_i$.
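The argument can be checked numerically. Below is a sketch with a hypothetical commuting Hermitian pair (the matrices are arbitrary illustrative choices): each eigenvector of $\Omega$ turns out to be an eigenvector of $\Lambda$ as well.

```python
import numpy as np

# Hypothetical commuting Hermitian pair with nondegenerate spectra:
Omega = np.array([[2.0, 1.0],
                  [1.0, 2.0]])   # eigenvalues 1 and 3
Lam = np.array([[0.0, 1.0],
                [1.0, 0.0]])     # eigenvalues -1 and 1
assert np.allclose(Omega @ Lam, Lam @ Omega)  # [Omega, Lam] = 0

# Each (normalized) eigenvector of Omega is also an eigenvector of Lam:
eigvals, eigvecs = np.linalg.eigh(Omega)
for i in range(2):
    v = eigvecs[:, i]
    w = Lam @ v
    lam_i = v @ w                 # the scale factor lambda_i
    assert np.allclose(w, lam_i * v)
```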


If two operators, $\hat{A}$ and $\hat{B}$, share the same orthonormal basis $\{|j\rangle\}$ and can be decomposed by the spectral theorem, it is fairly easy to show that the operators also commute.

We can express the eigenvalues of the decomposition as, $$ \begin{align} \hat{A}|j\rangle &=a_j|j\rangle \\ \hat{B}|j\rangle &= b_j|j\rangle. \end{align} $$

Thus the matrices when multiplied can be expressed as, $$ \begin{align} \hat{A} \hat{B} &= \sum_{j,k} a_j |j\rangle \langle j| b_k |k\rangle \langle k| \\ & = \sum_{j,k} a_j b_k |j\rangle \langle j|k \rangle \langle k| \\ &= \sum_{j,k} a_j b_k \delta_{j,k} |j\rangle \langle k| \\ &= \sum_j a_j b_j |j\rangle \langle j| \end{align} $$

Since the component eigenvalues commute, so too do the operators. $$ \begin{align} \left[\hat{A},\hat{B}\right] & = \sum_j a_j b_j |j\rangle \langle j| - \sum_j b_j a_j |j\rangle \langle j| \\ & = \sum_j \left(a_j b_j - b_j a_j\right) |j\rangle \langle j| = 0 \end{align} $$
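This construction is easy to reproduce numerically: build both operators from the same orthonormal basis via their spectral decompositions and verify the commutator vanishes (a sketch; the basis and the eigenvalue lists are arbitrary illustrative choices).

```python
import numpy as np

# A shared orthonormal basis: the columns of Q from a QR factorization.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
a = np.array([1.0, 2.0, 3.0])   # hypothetical eigenvalues a_j of A
b = np.array([5.0, -1.0, 4.0])  # hypothetical eigenvalues b_j of B

# Spectral decompositions: A = sum_j a_j |j><j|, B = sum_j b_j |j><j|
A = sum(a[j] * np.outer(Q[:, j], Q[:, j]) for j in range(3))
B = sum(b[j] * np.outer(Q[:, j], Q[:, j]) for j in range(3))

# A B = sum_j a_j b_j |j><j| = B A, so the commutator vanishes.
assert np.allclose(A @ B - B @ A, 0.0)
```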

(Edit)

To answer your question: they will not commute unless they share the same basis. You can see this in the third line of my derivation of the operator multiplication. The expression $\langle j | k\rangle = \delta_{j,k}$ only holds if the operators share the same basis. If they don't, then you have to do standard matrix multiplication, which will in general not commute.
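A quick numerical illustration of this failure (a sketch with arbitrary eigenvalues, not a proof): take one operator diagonal in the standard basis and another diagonal in a rotated basis, and the commutator is nonzero.

```python
import numpy as np

# A is diagonal in the standard basis (hypothetical eigenvalues 1, 2).
A = np.diag([1.0, 2.0])

# B is diagonal in a basis rotated by 45 degrees (eigenvalues 3, 4).
R = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)
B = R @ np.diag([3.0, 4.0]) @ R.T

# The bases differ, and the commutator does not vanish.
comm = A @ B - B @ A
assert not np.allclose(comm, 0.0)
```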