In minimization problems, say $\min_{u \in X} F(u)$, $X$ metric space, one of the steps of the Direct Method is the compactness of minimizing sequences: show that for any sequence $u_n$ such that $\sup_n F(u_n) < \infty$ we can extract a converging subsequence.
Consider now the following min-max problem (suppose $\omega$ is a Lipschitz domain, $X=H_0^1(\omega)$, $Y=L^2(\omega, \mathbb{R}^d)$ so that $\nabla u \in Y$, and $b \in X^*$):
$$\min_{u \in X} \max_{T \in Y} \ \langle \nabla u, T\rangle - \frac12\langle T, T \rangle - \langle b, u\rangle$$
- How does one prove the boundedness of "saddling" sequences $(u_n, T_n)$? In other words, what is the counterpart of the assumption $\sup_n F(u_n) < \infty$ for min-max problems?
- I'm aware of notions of variational convergence for saddle points, but I can't figure out how to prove weak sequential compactness. Are there any examples in the literature?
I think this problem can be reduced to minimising the Dirichlet energy. Let $u\in H_0^1(\omega)$ be fixed. Then, completing the square, for any $T\in L^2(\omega,\mathbb{R}^d)$ $$\langle \nabla u,T\rangle - \frac{1}{2}\langle T,T\rangle - \langle b,u\rangle = -\frac{1}{2}\|\nabla u - T\|^2_{L^2(\omega)} + \frac{1}{2}\langle\nabla u,\nabla u\rangle - \langle b,u\rangle.$$ This expression is maximal precisely when the norm vanishes, i.e. when $T=\nabla u$. Hence your problem reduces to $$\inf_{u\in H^1_0(\omega)}\sup_{T\in L^2(\omega,\mathbb{R}^d)}\langle \nabla u,T\rangle - \frac{1}{2}\langle T,T\rangle - \langle b,u\rangle = \inf_{u\in H^1_0(\omega)}\frac{1}{2}\langle\nabla u,\nabla u\rangle - \langle b,u\rangle.$$ Now define $$F(u):=\frac{1}{2}\langle\nabla u,\nabla u\rangle - \langle b,u\rangle$$ and you can proceed in the usual variational manner.
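To connect this back to the boundedness question: once the problem is reduced to minimising $F$, the compactness step is the usual coercivity argument. A sketch (assuming, via the Poincaré inequality, that $\|u\| := \|\nabla u\|_{L^2(\omega)}$ is taken as the norm on $H^1_0(\omega)$):

```latex
% Coercivity of F on H^1_0(\omega), with \|u\| := \|\nabla u\|_{L^2(\omega)}:
\begin{align*}
F(u) &= \tfrac{1}{2}\,\|\nabla u\|_{L^2}^{2} - \langle b, u\rangle \\
     &\ge \tfrac{1}{2}\,\|u\|^{2} - \|b\|_{X^*}\,\|u\|
      \;\longrightarrow\; \infty
      \quad\text{as } \|u\| \to \infty.
\end{align*}
% Hence \sup_n F(u_n) < \infty forces \sup_n \|u_n\| < \infty, and a weakly
% convergent subsequence of (u_n) exists.
```

For the pair $(u_n, T_n)$, the identity above suggests that the natural counterpart of $\sup_n F(u_n) < \infty$ is boundedness of the saddle values along the sequence: the penalty term $-\tfrac12\|\nabla u_n - T_n\|^2_{L^2}$ then keeps $T_n$ close to $\nabla u_n$ in $L^2$, so $(T_n)$ inherits the bound from $(u_n)$.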