Saddle Point Problem and Lagrangian


Let $X$, $M$ be Hilbert spaces. Consider two bilinear forms $a(\cdot,\cdot):X \times X\to \mathbb{R}$ and $b(\cdot,\cdot): X\times M\to \mathbb{R}$, and linear functionals $f:X\to \mathbb{R}$, $g:M\to\mathbb{R}$. Let $J(u) = \frac{1}{2}a(u,u)-f(u)$. Consider the saddle point problem: find $(u,\lambda)\in X\times M$ such that \begin{align*} a(u,v)+b(v,\lambda)&=f(v), &&\forall v\in X,\\ b(u,\mu) &= g(\mu), &&\forall \mu\in M. \end{align*} Let the Lagrangian be $\mathcal{L}(u,\lambda) = J(u)+b(u,\lambda)-g(\lambda)$. How do I show that the solution of the saddle point problem is a saddle point of the Lagrangian, i.e. $\mathcal{L}(u,\mu)\leq\mathcal{L}(u,\lambda)\leq \mathcal{L}(v,\lambda)$ for all $v\in X$ and $\mu\in M$? In particular, I cannot get the second inequality. This problem is from the book by Dietrich Braess, Finite Elements (p. 129).


I just ran into the same problem, but a friend of mine helped me out. The left inequality is immediate: since $b(u,\mu)=g(\mu)$ for every $\mu\in M$, we have $\mathcal{L}(u,\mu)=J(u)+b(u,\mu)-g(\mu)=J(u)$, so $\mathcal{L}(u,\cdot)$ is constant; no matter what you plug in as the second argument, the value is the same, and in particular $\mathcal{L}(u,\mu)=\mathcal{L}(u,\lambda)$. For the right inequality, write an arbitrary $v\in X$ as $v=u+w$ with $w\in X$, where $u$ is the first component of the saddle point. Using the symmetry of $a$ (which is given) and then the first equation with test function $w$, \begin{align*} \mathcal{L}(u+w,\lambda) &= \tfrac{1}{2}a(u+w,u+w)-f(u+w)+b(u+w,\lambda)-g(\lambda)\\ &= J(u)+b(u,\lambda)-g(\lambda)+\bigl[a(u,w)+b(w,\lambda)-f(w)\bigr]+\tfrac{1}{2}a(w,w)\\ &= \mathcal{L}(u,\lambda)+\tfrac{1}{2}a(w,w), \end{align*} since the bracketed term vanishes by the first equation. With the assumption $a(w,w)\geq 0$ (as in Braess's hypotheses), this gives $\mathcal{L}(u+w,\lambda)\geq \mathcal{L}(u,\lambda)$, and you are done. I hope that, even if the answer is late, it helps.
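To make the argument concrete, here is a minimal finite-dimensional sketch in Python (NumPy), under the assumptions that $a$ is realized by a symmetric positive definite matrix $A$, $b$ by a full-rank matrix $B$ via $b(v,\mu)=\mu^{T}Bv$, and $f,g$ by vectors. The names `A`, `B`, `f`, `g`, `L` are all illustrative choices, not from the book. It solves the saddle-point (KKT) system and checks both inequalities at random test points:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2

# Symmetric positive definite A realizes the bilinear form a(., .)
M_ = rng.standard_normal((n, n))
A = M_ @ M_.T + n * np.eye(n)
B = rng.standard_normal((m, n))   # b(v, mu) = mu^T B v, full rank a.s.
f = rng.standard_normal(n)
g = rng.standard_normal(m)

def L(v, mu):
    """Lagrangian L(v, mu) = 1/2 a(v,v) - f(v) + b(v, mu) - g(mu)."""
    return 0.5 * v @ A @ v - f @ v + mu @ B @ v - g @ mu

# Solve the saddle-point system:  A u + B^T lam = f,  B u = g
K = np.block([[A, B.T], [B, np.zeros((m, m))]])
sol = np.linalg.solve(K, np.concatenate([f, g]))
u, lam = sol[:n], sol[n:]

# Check L(u, mu) <= L(u, lam) <= L(v, lam) at random (v, mu)
for _ in range(100):
    v = rng.standard_normal(n)
    mu = rng.standard_normal(m)
    assert L(u, mu) <= L(u, lam) + 1e-10   # equality, since B u = g
    assert L(u, lam) <= L(v, lam) + 1e-10  # since a(w, w) >= 0

print("saddle-point inequalities verified")
```

Note that the first check holds with equality, matching the observation that $\mathcal{L}(u,\cdot)$ is constant, while the gap in the second check is exactly $\tfrac{1}{2}a(v-u,v-u)$.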