In high school physics and chemistry classes, we were told that entropy is a measure of disorder in a physical system. For example, molecules that are relatively stationary correspond to a lower entropy whereas molecules that are moving around correspond to a higher entropy.
But in statistics and probability, entropy is written using a different formula:
$$ H(X) = - \sum_{i=1}^{n} p(x_i)\log p(x_i). $$
- Is there any relationship between the interpretation of entropy in probability vs. physics/chemistry?
- In probability, is entropy also describing some type of "disorder"?
Thank you!
Is there any relationship between the interpretation of entropy in probability vs. physics/chemistry? Ans. Yes; they are the same quantity.

In probability, is entropy also describing some type of "disorder"? Ans. My impression is that it is not usual to think of it as "disorder"; instead, it is interpreted as a measure of uncertainty.
The entropies taught in physics and chemistry are the Clausius entropy, $dS = \delta Q_{\mathrm{rev}}/T$, and the Gibbs entropy, which is mathematically the same as Shannon's entropy. The Gibbs entropy is in fact also equivalent to the Clausius entropy.
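For reference, the Gibbs entropy has the same functional form as Shannon's, with Boltzmann's constant $k_B$ setting the physical units and the sum running over the microstates of the system:

$$ S = -k_B \sum_{i} p_i \ln p_i, $$

where $p_i$ is the probability that the system occupies microstate $i$. Up to the constant $k_B$ and the choice of logarithm base, this is exactly $H(X)$ from the question.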
Most often, entropy is interpreted as follows: the entropy of a physical system or a random variable is a measure of the degree of uncertainty about the state of that system or variable.
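To make the "uncertainty" reading concrete, here is a minimal sketch (the function name `shannon_entropy` is my own) showing that a fair coin, whose outcome is maximally uncertain, has the highest entropy, while a certain outcome has zero entropy:

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum p_i log p_i, in bits by default.

    Terms with p_i == 0 contribute nothing, following the
    convention 0 * log(0) = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin: maximal uncertainty, 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A certain outcome: no uncertainty, 0 bits.
print(shannon_entropy([1.0, 0.0]))   # 0.0

# A biased coin falls in between (about 0.469 bits).
print(shannon_entropy([0.9, 0.1]))
```

The more "spread out" (disordered, if you like) the distribution is over its outcomes, the higher the entropy, which is the probabilistic analogue of molecules occupying many states rather than sitting still.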