When is the standard deviation of the square of a random variable greater than the square of the standard deviation of that variable?


I was playing around with the uncertainty of the kinetic energy operator in quantum mechanics, and really wanted the following inequality to be true: $$\Delta K=\frac{1}{2m}\Delta (p^2)\geq\frac{1}{2m}(\Delta p)^2\geq\frac{\hbar^2}{8m(\Delta x)^2},$$ where $\Delta Q$ denotes the uncertainty (standard deviation) of the variable $Q$ and the last step follows from the Heisenberg uncertainty principle. The problem is that $\Delta(Q^2)\geq(\Delta Q)^2$ turns out not to be true in general. I ran some sample distributions in Mathematica, and while the inequality holds most of the time, there are exceptions, though in those cases the difference is generally slight.
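Since the check above was numerical (in Mathematica), here is a minimal NumPy sketch of the same comparison; the helper name `compare` and the particular test distributions are my own illustrative choices, not taken from the original experiments. It computes $\Delta(Q^2)$ and $(\Delta Q)^2$ from exact moments of a discrete distribution rather than from random samples, so the comparison is deterministic. Two simple cases where the desired inequality fails: $Q=\pm 1$ with equal probability (then $Q^2$ is constant, so $\Delta(Q^2)=0<(\Delta Q)^2=1$) and a uniform distribution on $[-1,1]$ (where $\Delta(Q^2)=\sqrt{4/45}\approx 0.30<(\Delta Q)^2=1/3$).

```python
import numpy as np

def compare(values, probs):
    """Return (Delta(Q^2), (Delta Q)^2) for a discrete distribution of Q.

    values: possible outcomes of Q; probs: their probabilities (summing to 1).
    """
    values = np.asarray(values, dtype=float)
    probs = np.asarray(probs, dtype=float)
    m1 = np.sum(probs * values)        # <Q>
    m2 = np.sum(probs * values**2)     # <Q^2>
    m4 = np.sum(probs * values**4)     # <Q^4>
    var_q = m2 - m1**2                 # (Delta Q)^2
    sd_q2 = np.sqrt(m4 - m2**2)        # Delta(Q^2)
    return sd_q2, var_q

# Q = +/-1 with equal probability: Q^2 is constant, so Delta(Q^2) = 0
# while (Delta Q)^2 = 1, and the inequality fails badly.
print(compare([-1.0, 1.0], [0.5, 0.5]))

# Discretised Uniform(-1, 1): Delta(Q^2) ~ 0.298 < (Delta Q)^2 ~ 0.333.
xs = np.linspace(-1.0, 1.0, 20001)
ps = np.full_like(xs, 1.0 / xs.size)
print(compare(xs, ps))
```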

Is there a condition we can impose (on the distribution or the allowed range of $Q$) that will enforce this inequality? If not, "how often" does the inequality fail? Expanding in terms of expectation values (squaring both sides first, so the left-hand side becomes a variance and no square roots appear) doesn't seem to help: $$\langle Q^4\rangle\geq 2\langle Q^2\rangle(\langle Q^2\rangle-\langle Q\rangle^2)+\langle Q\rangle^4,$$ even though we know $\langle Q^4\rangle\geq \langle Q^2\rangle^2\geq \langle Q\rangle^4$.
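For reference, the algebra behind that displayed inequality is just $$\left[\Delta(Q^2)\right]^2\geq\left[(\Delta Q)^2\right]^2\iff\langle Q^4\rangle-\langle Q^2\rangle^2\geq\left(\langle Q^2\rangle-\langle Q\rangle^2\right)^2=\langle Q^2\rangle^2-2\langle Q^2\rangle\langle Q\rangle^2+\langle Q\rangle^4,$$ and moving $\langle Q^2\rangle^2$ to the right-hand side gives $\langle Q^4\rangle\geq 2\langle Q^2\rangle(\langle Q^2\rangle-\langle Q\rangle^2)+\langle Q\rangle^4$.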