Constraint optimization with Calculus of Variations. How to handle positive function constraint?


I am attempting to maximize the functional $F[f]$ subject to the constraint that $f$ be non-negative, together with an integral constraint. More specifically, \begin{align*} &\max F[f]\\ &\text{s.t. } f(x) \ge 0 \quad \forall x\\ &\text{and } \int_{-\infty}^\infty f(x)\,dx = c \end{align*} where $c$ is some constant.

I decided to approach this with the method of Lagrange multipliers (if you know a better method, that would be great too). My Lagrangian is the following:

\begin{align*} L[f] = F[f] - \lambda_0\, f(x) - \lambda_1 \left(\int_{-\infty}^\infty f(x)\,dx - c\right) \end{align*}

The problem now is that I have to take the variational derivative with respect to $f$. My main question is: how do I handle the first constraint, since it is not in the form of an integral?

The variational derivative of $f(x)$ with respect to $f(x')$ (which requires introducing a new dummy variable $x'$) is $\delta(x-x')$. So we get \begin{align*} \frac{\delta L[f]}{\delta f(x')} = \frac{\delta F[f]}{\delta f(x')} - \lambda_0\,\delta(x-x') - \lambda_1 \end{align*}
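The two multiplier terms above can be checked numerically by finite differences, using a concrete (hypothetical) choice $F[f] = \int f(x)^2\,dx$, for which the standard result is $\delta F/\delta f(x') = 2f(x')$; perturbing $f$ at a single grid point and dividing by the grid spacing recovers the functional derivative:

```python
import numpy as np

# Finite-difference check of the functional derivatives in the question,
# for the hypothetical choice F[f] = ∫ f(x)^2 dx and the constraint
# functional G[f] = ∫ f(x) dx.
n = 200
x = np.linspace(-5.0, 5.0, n)
dx = x[1] - x[0]
f = np.exp(-x**2)  # any smooth test function

F = lambda g: np.sum(g**2) * dx
G = lambda g: np.sum(g) * dx

eps = 1e-6
i = 80                      # probe point x' = x[i]
e = np.zeros(n); e[i] = 1.0  # perturbation localized at x'

# Dividing the ordinary partial derivative by dx gives the functional
# derivative on the grid (a discrete delta has height 1/dx).
dF = (F(f + eps * e) - F(f)) / (eps * dx)   # should approximate 2 f(x[i])
dG = (G(f + eps * e) - G(f)) / (eps * dx)   # should approximate 1
```

This makes the delta-function bookkeeping visible: the derivative of the pointwise term $\lambda_0 f(x)$ is nonzero only when $x = x'$ (the $\delta(x-x')$ term), while the derivative of the integral constraint is the constant $1$, hence the bare $-\lambda_1$.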

My question is: how do people generally deal with the constraint that the function must be non-negative when using the calculus of variations? Is there a workaround? Or is the introduction of a dummy variable inevitable?

I would be grateful for any help. Thank you.