Help understanding a statement about the uncertainty principle.


According to the Robertson formula, for two arbitrary observables $X$ and $Z$ the uncertainty relation takes the following form: $$ \Delta X\Delta Z \ge {1 \over 2}\left| {\left\langle \psi \right|[X,Z]\left| \psi \right\rangle } \right|. $$ It was mentioned in "10.1007/s11128-018-2100-x" that:

However, the aforementioned deviation is not always optimal when evaluating the magnitude of the uncertainty, since the bounds for the deviation are state-dependent, which would give rise to a trivial result if the commutator related to the observables has zero expectation value.

Many papers report similar statements, for example:

Nevertheless, it is obvious that the lower bound of the deviation is state-dependent, which might result in a trivial result if the systematic state is associated with one of the eigenvectors of X and Z. "10.1007/s11128-019-2196-7"

I want help understanding what is meant here.


Best answer:

On the right-hand side of the inequality you have something that depends on $\psi$, the pure state of the system. The references you quoted are simply pointing out that if $\psi$ is an eigenvector of both $X$ and $Z$, then the RHS takes the following form: $$\frac12 \left|\langle \psi| (XZ - ZX)| \psi\rangle\right| = \frac12 \left|\langle \psi| XZ| \psi\rangle - \langle \psi| ZX| \psi\rangle\right| = \frac12 \left|xz - zx\right| = 0,$$ where $x, z \in \mathbb{R}$, $X |\psi\rangle = x |\psi\rangle$, and $Z |\psi\rangle = z |\psi\rangle$. In fact, an eigenvector of just one of the observables already suffices: if $X|\psi\rangle = x|\psi\rangle$, then since $X$ is Hermitian, $\langle \psi|XZ|\psi\rangle = x\langle \psi|Z|\psi\rangle = \langle \psi|ZX|\psi\rangle$, so the commutator again has zero expectation value. In that case the Robertson relation only says $\Delta X\,\Delta Z \ge 0$, which is trivially true and places no constraint on the uncertainties.
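
As a quick numerical check, here is a minimal NumPy sketch (the choice of $\sigma_x$, $\sigma_z$ as the observables and $|0\rangle$ as the state is purely illustrative, not taken from the papers):

```python
import numpy as np

# Pauli matrices as the two observables (illustrative choice)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # sigma_x
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # sigma_z

def robertson_sides(psi, A, B):
    """Return (Delta A * Delta B, Robertson bound) for a normalized pure state."""
    expval = lambda M: (psi.conj() @ M @ psi).real
    dA = np.sqrt(expval(A @ A) - expval(A) ** 2)
    dB = np.sqrt(expval(B @ B) - expval(B) ** 2)
    bound = 0.5 * abs(psi.conj() @ (A @ B - B @ A) @ psi)
    return dA * dB, bound

psi = np.array([1, 0], dtype=complex)   # |0>, an eigenvector of Z alone
print(robertson_sides(psi, X, Z))       # (0.0, 0.0): the bound is trivial
```

Here $\Delta X = 1$, yet the bound only guarantees $\Delta X\,\Delta Z \ge 0$; this is exactly the "trivial result" the quoted papers refer to.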

So if $X$ and $Z$ have a common eigenvector, then in that state the lower bound of the uncertainty relation is zero. Furthermore, if $X$ and $Z$ share a complete set of eigenvectors (equivalently, if they commute), then they can in principle be measured simultaneously to arbitrary precision, as the sketch below illustrates.
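
A minimal sketch of the commuting case (the diagonal matrices and their entries are hypothetical, chosen only so that the shared standard-basis eigenvectors are obvious):

```python
import numpy as np

# Two commuting observables: diagonal matrices share the standard-basis eigenvectors
X = np.diag([1.0, 2.0]).astype(complex)
Z = np.diag([3.0, 4.0]).astype(complex)

psi = np.array([1, 0], dtype=complex)   # common eigenvector: X psi = 1*psi, Z psi = 3*psi

expval = lambda M: (psi.conj() @ M @ psi).real
var = lambda M: expval(M @ M) - expval(M) ** 2
print(var(X), var(Z))   # 0.0 0.0: both observables are sharp in the same state
```

Both variances vanish in the common eigenstate, consistent with $[X, Z] = 0$: nothing in the uncertainty relation forbids knowing both values exactly.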