Jensen's inequality and convex Lagrangian


I was reading some lecture notes and came across the following example that I didn't quite understand. Consider the variational problem $ \int_{a}^{b}f(u'(x))\,dx$, where the Lagrangian $f$ is a convex function, $f \in C^1$, and $f$ depends only on the derivative of $u$. It was stated without proof that the global minimizer of this variational problem is the function $u(x):=\frac{\beta-\alpha}{b-a}(x-a) + \alpha$, where $u(a)=\alpha$ and $u(b)=\beta$ are the boundary conditions. How can I show that this makes sense? I tried using Jensen's inequality, but I can't get to the final answer.

Another possibility was to use the convexity of the Lagrangian, but I didn't see how to make use of the properties given for $u$.

I would like to derive the given answer, but I am not sure how.

Assume $b>a$, so that $\frac{1}{b-a}>0$.

Since $f$ is convex, Jensen's integral inequality applies (see the form involving a probability density function at https://en.wikipedia.org/wiki/Jensen%27s_inequality).

As $\frac{1}{b-a}\,dx$ is a probability measure on $[a,b]$, Jensen's inequality gives
$$ f\left( \int_a^b u'(x)\,\frac{dx}{b-a} \right) \leq \int_{a}^{b} f(u'(x))\,\frac{dx}{b-a}. $$
By the fundamental theorem of calculus, $\int_a^b u'(x)\,dx = u(b)-u(a) = \beta-\alpha$ for every admissible $u$, so
$$ \int_a^b f(u'(x))\,dx \;\geq\; (b-a)\,f\!\left(\frac{\beta-\alpha}{b-a}\right), $$
with equality if $u'$ is constant, i.e. if $u$ is an affine function. The affine function satisfying the boundary conditions $u(a)=\alpha$ and $u(b)=\beta$ is $u(x) = \frac{\beta - \alpha}{b-a}(x - a) + \alpha.$
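As a quick numerical sanity check (my own example, not from the notes), one can take the strictly convex Lagrangian $f(t)=t^2$ on $[a,b]=[0,1]$ with $u(0)=0$, $u(1)=1$, and compare the affine candidate against a competitor with the same boundary values:

```python
import numpy as np

# Hypothetical test case: f(t) = t^2, a = 0, b = 1, alpha = 0, beta = 1.
f = lambda t: t**2
a, b, alpha, beta = 0.0, 1.0, 0.0, 1.0
x = np.linspace(a, b, 100_001)

def integrate(y):
    """Trapezoidal rule on the uniform grid x."""
    dx = x[1] - x[0]
    return dx * (y.sum() - 0.5 * (y[0] + y[-1]))

# Affine candidate: u'(x) = (beta - alpha)/(b - a) is constant.
linear = integrate(f(np.full_like(x, (beta - alpha) / (b - a))))

# Competitor with the same boundary values: u(x) = x + 0.1*sin(pi*x),
# so u'(x) = 1 + 0.1*pi*cos(pi*x), and still u(0) = 0, u(1) = 1.
competitor = integrate(f(1 + 0.1 * np.pi * np.cos(np.pi * x)))

print(linear, competitor)  # linear = 1.0, competitor ≈ 1.0493
```

The affine function indeed gives the smaller value of the functional, consistent with the Jensen bound $(b-a)\,f\big(\frac{\beta-\alpha}{b-a}\big) = 1$.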

So this $u$ is a global minimizer. If $f$ is merely convex, uniqueness is not guaranteed; it is guaranteed when $f$ is strictly convex, since equality in Jensen's inequality then forces $u'$ to be constant.
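To see why strict convexity matters for uniqueness, here is a small illustration (again my own example): with the convex but not strictly convex Lagrangian $f(t)=t$, the functional equals $\int_a^b u'(x)\,dx = \beta - \alpha$ for every admissible $u$, so every candidate with the right boundary values is a global minimizer:

```python
import numpy as np

# f(t) = t is convex but not strictly convex, so f(u') = u' and the
# functional is u(b) - u(a) for any u with the given boundary values.
a, b = 0.0, 1.0
x = np.linspace(a, b, 100_001)

def integrate(y):
    """Trapezoidal rule on the uniform grid x."""
    dx = x[1] - x[0]
    return dx * (y.sum() - 0.5 * (y[0] + y[-1]))

linear = integrate(np.full_like(x, 1.0))                 # u(x) = x
wiggly = integrate(1 + 0.3 * np.pi * np.cos(np.pi * x))  # u(x) = x + 0.3*sin(pi*x)

print(linear, wiggly)  # both equal beta - alpha = 1
```

Both admissible functions give the same value, so the affine $u$ is a minimizer here but not the unique one.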