Continuous distribution (known mean $\mu$ and variance $\sigma^2$) - Maximum Entropy: Given mean $\mu$ and variance $\sigma^2$, what is the continuous distribution that maximizes the differential entropy $h(X) = -\int_{\mathcal{X}} f(x) \log f(x) \, \mathrm{d}x$? I want to prove this.
I tried calculating the KL divergence between two distributions, which produces the entropy term $h[f]$ as one part of the expression, but I am unable to proceed from there. How can I complete the proof?
$KL[f \parallel g] = \int_{\mathcal{X}} f(x) \log \frac{f(x)}{g(x)} \, \mathrm{d}x$
$= \int_{\mathcal{X}} f(x) \log f(x) \, \mathrm{d}x - \int_{\mathcal{X}} f(x) \log g(x) \, \mathrm{d}x$
$= -h[f] - \int_{\mathcal{X}} f(x) \log g(x) \, \mathrm{d}x$
The differential entropy is maximized by the normal distribution, and your derivation is one step away from the proof. The trick is to choose $g$ to be the Gaussian density with the same mean $\mu$ and variance $\sigma^2$ as $f$. Then

$\log g(x) = -\tfrac{1}{2}\log(2\pi\sigma^2) - \frac{(x-\mu)^2}{2\sigma^2},$

and because $f$ has mean $\mu$ and variance $\sigma^2$, the cross term depends on $f$ only through those two moments:

$-\int_{\mathcal{X}} f(x) \log g(x) \, \mathrm{d}x = \tfrac{1}{2}\log(2\pi\sigma^2) + \frac{\mathbb{E}_f[(X-\mu)^2]}{2\sigma^2} = \tfrac{1}{2}\log(2\pi e \sigma^2) = h[g].$

Substituting into your last line gives $KL[f \parallel g] = h[g] - h[f]$. Since $KL[f \parallel g] \ge 0$, with equality if and only if $f = g$ almost everywhere, it follows that

$h[f] \le h[g] = \tfrac{1}{2}\log(2\pi e \sigma^2),$

with equality exactly when $f$ is the Gaussian $\mathcal{N}(\mu, \sigma^2)$.
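As a numerical sanity check, here is a short sketch using SciPy's `entropy()` method on frozen distributions (the Laplace and uniform parameters below are chosen by me so that all three distributions share the same variance $\sigma^2$; they are not part of the proof, just an illustration):

```python
import numpy as np
from scipy.stats import norm, laplace, uniform

sigma = 1.0

# Gaussian with variance sigma^2: h = 0.5 * log(2*pi*e*sigma^2)
h_normal = norm(scale=sigma).entropy()

# Laplace with scale b has variance 2*b^2, so b = sigma / sqrt(2) matches sigma^2
h_laplace = laplace(scale=sigma / np.sqrt(2)).entropy()

# Uniform on an interval of width w has variance w^2 / 12, so w = sqrt(12) * sigma
w = np.sqrt(12) * sigma
h_uniform = uniform(loc=-w / 2, scale=w).entropy()

print(h_normal, h_laplace, h_uniform)
assert h_normal > h_laplace and h_normal > h_uniform
```

Among these equal-variance densities, the Gaussian indeed has the largest differential entropy, consistent with the bound $h[f] \le \tfrac{1}{2}\log(2\pi e \sigma^2)$.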