Suppose I wish to minimise the integral
$$I = \int_{s_0}^{s_1}\int_{t_0}^{t_1}F\, dt ds$$
where $F$ is a function of the six variables $x(s,t)$, $y(s,t)$, and their four partial derivatives, i.e.
$$F = F(x(s,t),y(s,t),x_s(s,t),x_t(s,t),y_s(s,t),y_t(s,t))$$
subject to the constraint
$$G = 0$$
for all $s$ and $t$, where $G$ is also a function of those six variables.
As far as I understand, I construct $L = F - \lambda G$ (with $\lambda = \lambda(s,t)$, since the constraint must hold pointwise), and turn the machinery of the calculus of variations (i.e. the Euler–Lagrange equations) to minimising
$$J = \int_{s_0}^{s_1}\int_{t_0}^{t_1}L\, dt ds$$
If we apply the Euler–Lagrange equation
$$\frac{\partial}{\partial t}\left(\frac{\partial L}{\partial x_t}\right)+\frac{\partial}{\partial s}\left(\frac{\partial L}{\partial x_s}\right) = \frac{\partial L}{\partial x}$$
and similarly for $y$ and $\lambda$, we get two second-order equations for $x$ and $y$, and varying with respect to $\lambda$ reclaims $G=0$. Suppose that the second-order equations are not analytically solvable. How do I determine $\lambda$?
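For concreteness, here is a simple instance of the setup that I tried (my own example): the Dirichlet energy with a unit-circle constraint,
$$F = \tfrac12\left(x_s^2 + x_t^2 + y_s^2 + y_t^2\right), \qquad G = x^2 + y^2 - 1.$$
The Euler–Lagrange equations for $L = F - \lambda G$ read
$$x_{ss} + x_{tt} = -2\lambda x, \qquad y_{ss} + y_{tt} = -2\lambda y.$$
Multiplying these by $x$ and $y$ respectively, adding, and using the second derivatives of the constraint (e.g. $\partial_s^2$ of $x^2+y^2=1$ gives $x_s^2 + y_s^2 + x\,x_{ss} + y\,y_{ss} = 0$) yields
$$\lambda = \tfrac12\left(x_s^2 + x_t^2 + y_s^2 + y_t^2\right),$$
so in this particular case $\lambda$ is determined pointwise by the solution itself. Is there a general procedure when such an elimination is not available?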
In addition, how does this enforce $G=0$? Surely if, for example, $\lambda>0$, then a solution which makes $G$ very negative (and hence makes $L$ very negative) would prevail over one that actually satisfies the constraint? Do we need to require $G\ge 0$ as part of our formulation of $G$ (by, for example, squaring it)?
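A finite-dimensional analogue of my worry (my own toy example): minimise $f = x^2 - y^2$ subject to $g = y = 0$. The stationary point of $L = f - \lambda g$ is $(x,y,\lambda)=(0,0,0)$, which is the constrained minimiser, and yet $L$ is *not* minimised there:

```python
# Toy finite-dimensional analogue: minimise f(x, y) = x^2 - y^2
# subject to g(x, y) = y = 0.  The Lagrangian is L = f - lam * g.

def L(x, y, lam):
    return (x**2 - y**2) - lam * y

def grad_L(x, y, lam):
    """Gradient of L with respect to (x, y, lam)."""
    return (2 * x,          # dL/dx
            -2 * y - lam,   # dL/dy
            -y)             # dL/dlam = -g: stationarity in lam enforces g = 0

# (0, 0, 0) is a stationary point of L, and it satisfies the constraint g = 0.
assert grad_L(0.0, 0.0, 0.0) == (0.0, 0.0, 0.0)

# But L is NOT minimised there: moving along the constraint-violating
# direction y makes L arbitrarily negative, so (0, 0, 0) is a saddle of L.
print(L(0.0, 0.0, 0.0))    # 0.0 at the stationary point
print(L(0.0, 10.0, 0.0))   # -100.0, so L has no unconstrained minimum
```

This suggests the method really asks for *stationarity* of $L$ rather than outright minimisation, in which case the sign of $\lambda$ would be irrelevant — is that the right reading?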