Weak Minimizer of a Functional


I showed that $u(x) = \frac{x^2}{2}$ is a potential minimizer of the functional $\int_0^2 \left( \frac{n}{2}u'(x)^2 - nu(x) \right) dx$ over $C^2[0,2]$ with boundary conditions $u(0) = 0$ and $u(2) = 2$, where $n$ is a positive constant, using the Euler–Lagrange equation. I am now trying to prove that this is actually a weak local minimizer and am unsure how to proceed. By weak minimizer I mean a local minimizer with respect to the $C^1$ norm.

I've been thinking about somehow using the first variation of a functional.

Thanks in advance for your help.


Let $L(u,v) = \frac{n}{2} v^2 - n u$ and let $C = \{u \in C^2[0,2] \mid u(0) = 0, \, u(2) = 2 \}$. Let $\hat{u}(x) = \frac{1}{2}x(4-x)$. Note that $\hat{u}$ satisfies both the boundary conditions and the Euler–Lagrange equation, i.e., $\frac{\partial L(\hat{u}(x), \hat{u}'(x))}{\partial u} = -n = n \hat{u}''(x) = \frac{d}{dx}\frac{\partial L(\hat{u}(x), \hat{u}'(x))}{\partial v}$.
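As a quick sanity check of the claim above (a sketch assuming `sympy` is available), one can verify symbolically that $\hat{u}(x) = \frac{1}{2}x(4-x)$ satisfies both boundary conditions and the Euler–Lagrange equation $-n - n\hat{u}''(x) = 0$:

```python
import sympy as sp

x, n = sp.symbols('x n', positive=True)
u_hat = sp.Rational(1, 2) * x * (4 - x)   # candidate minimizer \hat{u}

# Boundary conditions
bc0 = u_hat.subs(x, 0)   # expect 0
bc2 = u_hat.subs(x, 2)   # expect 2

# Euler-Lagrange residual for L = (n/2) v^2 - n u:
# dL/du - d/dx(dL/dv) = -n - n u''
el_residual = -n - sp.diff(n * sp.diff(u_hat, x), x)

print(bc0, bc2, sp.simplify(el_residual))   # 0 2 0
```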

Note that $C$ is convex, and $L$ is convex (it is linear in $u$ and quadratic, with positive coefficient, in $v$). Since $u \mapsto u'$ is linear, the map $u \mapsto L(u,u')$ is convex, and hence the functional $J$ defined by $J(u) = \int_0^2 L(u(x),u'(x)) \, dx$ is convex.

The problem is $P: \ \min_{u \in C} J(u)$. Since the objective and the constraint set are convex, $\hat{u}$ solves $P$ iff $dJ(\hat{u}, u-\hat{u}) \ge 0$ for all $u \in C$.

The directional derivative is given by $dJ(u,\delta) = \int_0^2 ( \frac{\partial L(u(x), u'(x))}{\partial u} \delta(x) + \frac{\partial L(u(x), u'(x))}{\partial v} \delta'(x) ) dx$.

Since $\hat{u}'(x) = 2-x$, for any $\delta \in C^1[0,2]$ with $\delta(0) = \delta(2) = 0$ we have $dJ(\hat{u},\delta) = \int_0^2 \left( -n\,\delta(x) + n(2-x)\,\delta'(x) \right) dx$; integrating the second term by parts and using the boundary conditions on $\delta$ turns it into $\int_0^2 n\,\delta(x) \, dx$, so the two terms cancel and $dJ(\hat{u},\delta) = 0$. Since $u - \hat{u}$ is such a $\delta$ for every $u \in C$, $\hat{u}$ is a minimizer of $P$.
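The cancellation can be checked concretely (again assuming `sympy`) with a sample admissible variation, e.g. $\delta(x) = \sin(\pi x/2)$, which vanishes at both endpoints:

```python
import sympy as sp

x, n = sp.symbols('x n', positive=True)
u_hat = 2*x - x**2/2                  # \hat{u}(x) = x(4-x)/2
delta = sp.sin(sp.pi * x / 2)         # admissible variation: delta(0) = delta(2) = 0

# dJ(u, delta) = integral of (dL/du * delta + dL/dv * delta') for L = (n/2) v^2 - n u
integrand = -n * delta + n * sp.diff(u_hat, x) * sp.diff(delta, x)
dJ = sp.integrate(integrand, (x, 0, 2))

print(sp.simplify(dJ))   # 0
```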

Since $\hat{u} \in C^2[0,2]$ (so that the Euler–Lagrange equation is satisfied), the optimality condition $dJ(\hat{u},\delta) = 0$ shows that the constraint set $C$ may be enlarged to $C_{\text{relaxed}} = \{u \in C^1[0,2] \mid u(0) = 0, \, u(2) = 2 \}$: the same argument shows that $\hat{u}$ minimizes $J$ over $C_{\text{relaxed}}$ as well, and in particular $\hat{u}$ is a weak minimizer in the $C^1$ norm.
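To illustrate that the minimum is genuine and not merely stationary (a sketch assuming `sympy`; the perturbation direction is an arbitrary choice), one can compute $J(\hat{u} + \varepsilon\delta) - J(\hat{u})$ for $\delta(x) = \sin(\pi x/2)$ and see that it is strictly positive for every $\varepsilon \neq 0$:

```python
import sympy as sp

x = sp.symbols('x')
n, eps = sp.symbols('n epsilon', positive=True)

def J(u):
    """The functional J(u) = integral of (n/2) u'^2 - n u over [0, 2]."""
    return sp.integrate(sp.Rational(1, 2) * n * sp.diff(u, x)**2 - n * u, (x, 0, 2))

u_hat = 2*x - x**2/2              # \hat{u}(x) = x(4-x)/2
delta = sp.sin(sp.pi * x / 2)     # vanishes at both endpoints

gap = sp.simplify(J(u_hat + eps * delta) - J(u_hat))
print(gap)   # equals n*pi**2*epsilon**2/8, strictly positive
```

The first-order term vanishes (that is $dJ(\hat{u},\delta) = 0$), so the gap is purely the quadratic term $\frac{\varepsilon^2}{2}\int_0^2 n\,\delta'(x)^2\,dx$, consistent with convexity of $J$.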