Entropy vs Relative Entropy.

Can anyone help me reconcile two well-known observations?


$\textbf{Observation 1}$

The Second Law of Thermodynamics says that a system evolves so as to maximise its entropy, i.e. to minimise the negative entropy

$$ H(\rho):=\int \rho \log \rho \, dx, $$

where $\rho$ is a probability density on $\mathbb{R}^d$.
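
(To fix the sign convention, here is a small numerical sketch of my own, not part of the observation itself: for a 1-D Gaussian with standard deviation $\sigma$, the quantity $H(\rho)=\int\rho\log\rho\,dx$ has the closed form $-\tfrac12\log(2\pi e\sigma^2)$, so minimising $H$ means maximising the usual differential entropy. The SciPy check below is purely illustrative.)

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

sigma = 2.0
rho = norm(loc=0.0, scale=sigma).pdf  # 1-D Gaussian density, my illustrative choice

# Numerical value of H(rho) = int rho log rho dx over a range where rho is non-negligible.
H_numeric, _ = integrate.quad(lambda x: rho(x) * np.log(rho(x)), -10 * sigma, 10 * sigma)

# Closed form for a 1-D Gaussian: H(rho) = -1/2 * log(2*pi*e*sigma^2).
H_exact = -0.5 * np.log(2 * np.pi * np.e * sigma ** 2)

print(H_numeric, H_exact)  # both approximately -2.112
```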

$\textbf{Observation 2}$

If we consider i.i.d. random variables $X_i\sim \mu$ on $\mathbb{R}^d$, $i=1,2,\ldots$, with empirical distribution

$$ \rho_n:=\frac{1}{n}\sum_{i=1}^n \delta_{X_i} $$

then Sanov's Theorem says that $\rho_n$ satisfies

$$ \mathbb{P}(\rho_n\approx \rho) \approx \exp\big(-n H(\rho|\mu)\big), $$

where

$$ H(\rho|\mu):=\begin{cases} \int f\log f\,d\mu &\text{if}~\rho \ll\mu ~\text{and}~\rho=f\mu, \\ \infty &\text{otherwise}, \end{cases} $$ is the relative entropy of $\rho$ with respect to $\mu$.
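
(As a sanity check on the rate in Sanov's Theorem, here is a finite-alphabet sketch of my own using the method of types; the alphabet $\{0,1,2\}$, the distributions `mu` and `rho`, and the helper `log_prob_type` are illustrative choices, not part of the observation. The exact probability that the empirical distribution equals a fixed type decays like $\exp(-nH(\rho|\mu))$ up to polynomial factors.)

```python
import numpy as np
from scipy.special import gammaln

mu  = np.array([0.5, 0.3, 0.2])   # sampling distribution (illustrative)
rho = np.array([0.2, 0.3, 0.5])   # target empirical distribution, a "type" (illustrative)

def log_prob_type(n, rho, mu):
    """log P(rho_n = rho): multinomial count of sequences with exactly these frequencies."""
    counts = np.rint(n * rho)                 # assumes n*rho is integer-valued
    log_multinom = gammaln(n + 1) - gammaln(counts + 1).sum()
    return log_multinom + (counts * np.log(mu)).sum()

H_rel = (rho * np.log(rho / mu)).sum()        # relative entropy H(rho | mu)

for n in [10, 100, 1000, 10000]:
    rate = -log_prob_type(n, rho, mu) / n
    print(n, rate, H_rel)                     # the empirical rate approaches H_rel as n grows
```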


$\textbf{Question}$

Here it is typical to think of $\mu$ as the stationary distribution of a stochastic process. What is the link between these two observations? Is the first just a time-dependent version of the second?