Let $a(x), b(x)$ be two positive functions that integrate to $1$. I am trying to minimize the following functional:
$$J(y) = \int_\mathbb R y(x) \log \frac{a(x)}{b(x)} \,\mathrm{d}x$$ subject to the conditions $y(x) \ge 0$ and $\int_\mathbb R y(x)\,\mathrm{d}x = 1$.
In other words $y, a, b$ are probability density functions.
Note
For $y(x) = a(x)$, the functional is the KL divergence $D_{\mathrm{KL}}(a \,\|\, b)$. I am interested in what happens when $y(x) \neq a(x)$, and in particular in which $y(x)$ attains a minimum or a maximum.
Thoughts
I have not studied calculus of variations before, so I am very unsure about what I read. I tried reading this page: http://liberzon.csl.illinois.edu/teaching/cvoc/node38.html but
1) It does not deal with inequality constraints and
2) I am having a hard time even properly setting up and solving the reduced problem with just the integral constraint.
This question doesn't really require the calculus of variations. Since $y \ge 0$ and $\|y\|_1 = \int_{\mathbb{R}} y(x)\,\mathrm{d}x = 1$, $$ \begin{align} J(y) &=\int_{\mathbb{R}}y(x)\log\left(\frac{a(x)}{b(x)}\right)\,\mathrm{d}x\\ &\le\|y\|_1\sup_{x\in\mathbb{R}}\left(\log\left(\frac{a(x)}{b(x)}\right)\right)\\ &=\sup_{x\in\mathbb{R}}\left(\log\left(\frac{a(x)}{b(x)}\right)\right), \end{align} $$ and this bound can be approached by letting $y$ tend to an approximation of the Dirac delta function centered at a point where $\log\left(\frac{a(x)}{b(x)}\right)$ attains (or approaches) its supremum. If $\log\left(\frac{a(x)}{b(x)}\right)$ is unbounded above, then the supremum of $J$ is $\infty$.
The same argument with $\inf\limits_{x\in\mathbb{R}}$ in place of $\sup\limits_{x\in\mathbb{R}}$ gives the infimum.
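A quick numerical illustration of the delta-approximation argument (the densities here are my own choice, not from the question): take $a$ and $b$ to be unit-scale Laplace densities centered at $0$ and $2$, so that $\log\left(\frac{a(x)}{b(x)}\right) = |x-2| - |x|$ is bounded with supremum $2$, attained on all of $x \le 0$. Concentrating $y$ there drives $J(y)$ to the bound:

```python
import numpy as np

# Example densities (my own choice): a, b unit-scale Laplace densities
# centered at 0 and 2, so log(a(x)/b(x)) = |x - 2| - |x|, which is
# bounded with supremum 2, attained for every x <= 0.
x = np.linspace(-10.0, 10.0, 100001)
dx = x[1] - x[0]
log_ratio = np.abs(x - 2.0) - np.abs(x)

def J(y):
    """Approximate J(y) = integral of y(x) * log(a(x)/b(x)) dx on the grid."""
    return np.sum(y * log_ratio) * dx

def gaussian(mu, sigma):
    """Normalized Gaussian bump on the grid, used as a delta approximation."""
    y = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return y / (np.sum(y) * dx)

# As y concentrates where log(a/b) attains its supremum, J(y) approaches 2.
for sigma in (1.0, 0.3, 0.05):
    print(f"sigma = {sigma:4.2f}:  J(y) = {J(gaussian(-3.0, sigma)):.6f}")
```

In this particular example any $y$ supported entirely on $x \le 0$ already attains $J(y) = 2$ exactly; the shrinking Gaussians just mimic the general delta-approximation argument, which is what one needs when the supremum is attained only at a single point or in a limit.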
A hint that this problem has no interior stationary point is that the Lagrange multiplier condition requires $\log\left(\frac{a(x)}{b(x)}\right)$ to be constant, which (since $a$ and $b$ both integrate to $1$) forces $a = b$.
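For completeness, here is how that Lagrange condition arises (a standard first-variation computation, ignoring the inequality constraint $y \ge 0$). Form the Lagrangian
$$\mathcal{L}(y,\lambda)=\int_{\mathbb{R}}y(x)\log\left(\frac{a(x)}{b(x)}\right)\,\mathrm{d}x-\lambda\left(\int_{\mathbb{R}}y(x)\,\mathrm{d}x-1\right).$$
Since the integrand is linear in $y$, the first variation in a direction $h$ is
$$\delta\mathcal{L}(y,\lambda;h)=\int_{\mathbb{R}}h(x)\left(\log\left(\frac{a(x)}{b(x)}\right)-\lambda\right)\,\mathrm{d}x,$$
and requiring this to vanish for all admissible $h$ forces $\log\left(\frac{a(x)}{b(x)}\right)=\lambda$ for all $x$. Because $J$ is linear in $y$, there is no interior stationary point unless $a/b$ is constant; the extrema live on the boundary of the constraint set, which is exactly what the delta-approximation argument exhibits.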