Take maximum differential entropy as an example: the Gaussian achieves the maximum differential entropy when the second moment is fixed.
In calculus-of-variations form: \begin{equation} \begin{split} \min_f \quad & \int f(x)\log f(x)\, dx,\\ \text{s.t.} \quad & \int f(x)\, dx=1,\\ & \int x^2 f(x)\, dx=\sigma^2 \end{split} \end{equation}
Using Lagrange multipliers and the Euler–Lagrange equation, we can conclude that the Gaussian is a necessary (stationary) solution of this problem. However, how does one prove sufficiency, i.e. that the Gaussian actually attains the optimum?
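(For reference, here is the necessity step in a sketch, with $\lambda_0,\lambda_1$ the multipliers of the two constraints. Since the integrand contains no derivatives of $f$, the Euler–Lagrange equation reduces to setting the partial derivative with respect to $f$ to zero:) $$\frac{\partial}{\partial f}\left[f\log f + \lambda_0 f + \lambda_1 x^2 f\right] = \log f + 1 + \lambda_0 + \lambda_1 x^2 = 0 \;\Rightarrow\; f(x) = e^{-1-\lambda_0}\,e^{-\lambda_1 x^2},$$ and solving the two constraints for $\lambda_0,\lambda_1$ yields exactly the $N(0,\sigma^2)$ density.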
Most textbooks just end the proof with the Lagrange-multiplier step (necessity). Is that enough? I do not know much about functional analysis. Can we carry the theorems for convex functions over to this setting? For example, prove that the constraints form a compact set, so that a minimum exists and the Gaussian is the only candidate; or prove that the Gaussian is a local minimum, and since this is a convex functional over a convex set, local = global?
Sorry for the confusion; I am asking how to prove sufficiency within the calculus-of-variations framework.
Let $f(x)$ be the density of a zero-mean normal with variance $\sigma^2$, and let $g(x)$ be any other density with the same second moment, $\sigma^2$. The fact that they have the same second moment means that the following equality holds:
$$\int x^2g(x)\,dx = \int x^2f(x)\,dx \qquad [1]$$ Now consider the Kullback–Leibler divergence of the two densities:
$$0\le D_{KL}(g||f) = \int g(x)\log\left(\frac {g(x)}{f(x)}\right)dx = \int g(x)\log[g(x)]\,dx -\int g(x)\log[f(x)]\,dx \qquad [2]$$
The first term is the negative of the entropy of $g$; denote it $-h(g)$. The second term is (writing out the Gaussian pdf explicitly)
$$\int g(x)\log[f(x)]\,dx = \int g(x)\log\left[\frac{1}{\sqrt {2\pi}\sigma}\exp\left\{-\frac {x^2}{2\sigma^2}\right\}\right]dx $$ $$=\log\left[\frac{1}{\sqrt {2\pi}\sigma}\right]\int g(x)\,dx-\frac 1{2\sigma^2}\int x^2 g(x)\,dx $$ The first integral integrates to unity, and using also eq. $[1]$ we obtain
$$\int g(x)\log[f(x)]\,dx = \log\left[\frac{1}{\sqrt {2\pi}\sigma}\right] - \frac 1{2\sigma^2}\int x^2 f(x)\,dx $$ But this is exactly the negative of the entropy of the Gaussian; denote it $-h(f)$.
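(An aside, not part of the proof: the closed form implied here, $h(f)=\log(\sqrt{2\pi}\sigma)+\tfrac12=\tfrac12\log(2\pi e\sigma^2)$, is easy to verify numerically. A minimal sketch, with the value $\sigma=1.5$ chosen arbitrarily:)

```python
import math

# Check that the Gaussian entropy closed form 0.5*log(2*pi*e*sigma^2)
# matches a direct midpoint Riemann sum of -\int f(x) log f(x) dx.
sigma = 1.5
dx = 1e-3

def f(x):
    # zero-mean Gaussian density with standard deviation sigma
    return math.exp(-x * x / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

# midpoint grid over [-12, 12], wide enough that the tails are negligible
xs = [-12.0 + (i + 0.5) * dx for i in range(round(24.0 / dx))]
h_numeric = -sum(f(x) * math.log(f(x)) for x in xs) * dx
h_closed = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
print(h_numeric, h_closed)  # the two values agree closely
```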
Inserting these results into eq. $[2]$ we have $$0\le D_{KL}(g||f) = -h(g) - (-h(f)) \Rightarrow h(g) \le h(f)$$ Since $g$ was arbitrary, this proves that the Gaussian has maximum entropy among all distributions with the same variance (fixed second moment).
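As a numerical sanity check (a sketch only; the Laplace and uniform densities are my arbitrary choices of competing $g$'s, not anything singled out by the argument), one can compare closed-form entropies at equal variance and verify the identity $D_{KL}(g||f)=h(f)-h(g)$ by quadrature:

```python
import math

sigma = 1.0  # common variance sigma^2 for all three densities

# Closed-form differential entropies at variance sigma^2:
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)
b = sigma / math.sqrt(2)            # Laplace scale: variance = 2*b^2
h_laplace = 1 + math.log(2 * b)
w = math.sqrt(12) * sigma           # uniform width: variance = w^2 / 12
h_uniform = math.log(w)
print(h_gauss, h_laplace, h_uniform)  # Gaussian is the largest

# Verify D_KL(g||f) = h(f) - h(g) for g = Laplace, f = Gaussian,
# by midpoint-rule quadrature of g * log(g / f).
def f(x):
    return math.exp(-x * x / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def g(x):
    return math.exp(-abs(x) / b) / (2 * b)

dx = 1e-3
xs = [-20.0 + (i + 0.5) * dx for i in range(round(40.0 / dx))]
d_kl = sum(g(x) * math.log(g(x) / f(x)) for x in xs) * dx
print(d_kl, h_gauss - h_laplace)  # the two values agree
```

The identity holds here because the Laplace density was given the same second moment as the Gaussian, which is exactly the role eq. $[1]$ plays in the proof.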