What is the relation between Bayes' theorem and the Gibbs distribution?


This question refers to the link: https://en.wikipedia.org/wiki/Principle_of_maximum_entropy specifically to the last sentence of the page:

"For the case of given average values as testable information (averaged over the sought after probability distribution), the sought after distribution is formally the Gibbs (or Boltzmann) distribution the parameters of which must be solved for in order to achieve minimum cross entropy and satisfy the given testable information."

Even after several attempts, I could not understand what exactly it means. Can anyone please help?

BEST ANSWER

It is saying that when the testable information consists of given average values (a particular kind of constraint), the maximum-entropy distribution is a Gibbs distribution: the cross-entropy is extremized precisely by that distribution. This is analogous to the unconstrained case — when you do not impose averages (or, more generally, do not "reduce" the information in a sample), the usual Gibbs distribution is the one that maximizes the usual entropy.
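To make the Wikipedia sentence concrete, here is a standard sketch of the variational problem it refers to (with a uniform prior, so maximizing entropy and minimizing cross-entropy coincide); the constraint function $f$ and target average $F$ are generic placeholders, not taken from the post:

```latex
\max_{p}\; -\sum_i p_i \ln p_i
\quad \text{subject to} \quad
\sum_i p_i = 1, \qquad \sum_i p_i\, f(x_i) = F .
```

Introducing Lagrange multipliers and setting the variation to zero yields the Gibbs (Boltzmann) form

```latex
p_i = \frac{e^{-\lambda f(x_i)}}{Z(\lambda)}, \qquad
Z(\lambda) = \sum_i e^{-\lambda f(x_i)},
```

where the parameter $\lambda$ is exactly what "must be solved for": it is fixed by requiring that the resulting distribution reproduce the given average $\sum_i p_i f(x_i) = F$.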

This is a very particular case of the core examples of ergodic theory, although believing that the computation is feasible even without averaging is really courageous.
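The "solve for the parameters" step above can be sketched numerically. The following toy example (my own illustrative values, not from the question) takes a finite state space with a prescribed mean and finds the $\lambda$ of the Gibbs distribution by bisection:

```python
import math

# Toy maximum-entropy problem: states xs with constrained average
# sum_i p_i * x_i = target_mean. The maxent solution has the Gibbs form
# p_i = exp(-lam * x_i) / Z(lam); lam is the parameter to solve for.
xs = [0.0, 1.0, 2.0, 3.0]
target_mean = 1.2  # must lie strictly between min(xs) and max(xs)

def gibbs(lam):
    """Gibbs distribution over xs for a given multiplier lam."""
    weights = [math.exp(-lam * x) for x in xs]
    Z = sum(weights)
    return [w / Z for w in weights]

def mean(lam):
    """Average of x under the Gibbs distribution with multiplier lam."""
    return sum(p * x for p, x in zip(gibbs(lam), xs))

# mean(lam) is strictly decreasing in lam, so bisection finds the unique root.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = gibbs(lam)

print("lambda        =", lam)
print("p             =", p)
print("achieved mean =", mean(lam))
```

With the constraint satisfied, this `p` is the entropy-maximizing distribution among all distributions on `xs` with that mean, which is exactly the claim in the quoted sentence.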