I am interested in computing a minimal surface on a domain $\Omega = [0,1]^{2}$. Specifically, I would like to fit some Ansatz function $z(x, y) = ax^{2}y^{2} + b x y + c x + d y$ with parameters $a, b, c, d \in \mathbb{R}$ such that the resulting surface is minimal.
Naturally, without any constraints on these parameters I would expect the solution to be a plane (i.e., satisfying $a = b = 0$). So, let's say I have some additional constraint $v(a,b,c,d) = 0$ (sufficiently smooth, at least $C^{2}$), for example modeling behavior at the boundary that precludes this solution.
The Wikipedia article on minimal surfaces mentions the Euler–Lagrange equations, i.e., additional PDE constraints. I believe this would reduce the example problem to a nonlinear system of equations for the given $z$.
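For reference, the Euler–Lagrange equation of the area functional for a graph $z(x, y)$ is the minimal surface equation

$$ \left(1 + z_{y}^{2}\right) z_{xx} - 2\, z_{x} z_{y}\, z_{xy} + \left(1 + z_{x}^{2}\right) z_{yy} = 0, $$

which for the ansatz above would have to hold pointwise on $\Omega$, giving one polynomial equation in $(x, y)$ per point rather than a finite system in $a, b, c, d$ alone.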
I would prefer to have a (smooth) objective in $a, b, c, d$ instead, which could then be used to transform the problem into a (constrained) nonlinear programming problem. Can such an objective be derived (say, from a variational formulation)?
After some literature research I arrived at the following:
First, the PDE constraints are likely not satisfiable at all points $(x, y) \in \Omega$ for any choice of the parameters, since the ansatz is too restrictive. The goal should instead be to make the surface area as small as possible while satisfying the constraint $v$. The area of the surface given by $z$ can be calculated as
$$ \int_{0}^{1} \int_{0}^{1} \sqrt{1 + \left(\frac{\partial z}{\partial x}\right)^{2} + \left(\frac{\partial z}{\partial y}\right)^{2}} \,\mathrm{d}y \,\mathrm{d}x $$
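As a sanity check, this area functional can be evaluated numerically for given parameters; here is a minimal sketch using `scipy.integrate.dblquad`, with the partial derivatives of the ansatz written out by hand:

```python
import numpy as np
from scipy.integrate import dblquad

def surface_area(a, b, c, d):
    """Area of z(x, y) = a x^2 y^2 + b x y + c x + d y over [0, 1]^2."""
    def integrand(y, x):
        zx = 2 * a * x * y**2 + b * y + c  # dz/dx
        zy = 2 * a * x**2 * y + b * x + d  # dz/dy
        return np.sqrt(1.0 + zx**2 + zy**2)
    area, _ = dblquad(integrand, 0.0, 1.0, 0.0, 1.0)
    return area

# The flat plane a = b = c = d = 0 should have area 1,
# and the tilted plane z = x should have area sqrt(2).
print(surface_area(0, 0, 0, 0))
print(surface_area(0, 0, 1, 0))
```

The two plane checks confirm the integrand: a constant integrand $\sqrt{1 + c^2 + d^2}$ is recovered whenever $a = b = 0$.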
This objective is smooth in the parameters $a, b, c, d$, since the expression under the square root is bounded away from zero. To find a suitable solution, we can minimize this objective subject to the constraint $v(a,b,c,d) = 0$, yielding a nonlinear programming problem.
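The resulting nonlinear program can be handed to any NLP solver; the sketch below uses `scipy.optimize.minimize` with SLSQP and a *hypothetical* constraint $v(a,b,c,d) = a + b - 1 = 0$ (chosen only to rule out the flat plane $a = b = 0$; substitute your actual $v$):

```python
import numpy as np
from scipy.integrate import dblquad
from scipy.optimize import minimize

def surface_area(p):
    """Area of z(x, y) = a x^2 y^2 + b x y + c x + d y over [0, 1]^2."""
    a, b, c, d = p
    def integrand(y, x):
        zx = 2 * a * x * y**2 + b * y + c  # dz/dx
        zy = 2 * a * x**2 * y + b * x + d  # dz/dy
        return np.sqrt(1.0 + zx**2 + zy**2)
    return dblquad(integrand, 0.0, 1.0, 0.0, 1.0)[0]

# Hypothetical equality constraint v(a, b, c, d) = a + b - 1 = 0,
# standing in for whatever boundary condition precludes the plane.
constraints = [{"type": "eq", "fun": lambda p: p[0] + p[1] - 1.0}]

res = minimize(surface_area, x0=[0.5, 0.5, 0.0, 0.0],
               method="SLSQP", constraints=constraints)
print(res.x)   # optimal (a, b, c, d)
print(res.fun) # minimal area under the constraint
```

Since SLSQP estimates gradients by finite differences here, each iteration calls `dblquad` several times; for a four-parameter problem this is cheap, but for larger ansätze one would differentiate the area integral analytically or use automatic differentiation.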