The input is a set of triples $\left( x,y,v\right)$ where $x,y,v \in \mathbb{R} ^{+}$
I need to find a function $f$ with $f(x,y) \approx v$ by fitting the parameters of the following model
$f(x,y) = a + bx^c + dy^e $ where
- $f\left( x,y\right) >0,\forall x,y\in \mathbb{R} ^{+}$
- $\frac{\partial f}{\partial x}\ge 0$ and $\frac{\partial f}{\partial y}\ge 0,\forall x,y\in \mathbb{R} ^{+}$
Ideally I'd like to know how to compute both an interpolating surface (if one exists) and the approximation (regression) with minimal squared error.
Finding the unconstrained regression for this model is straightforward.
I'd like to know the best way to do it with the constraints, namely that $f$ must be positive and non-decreasing on the given domain, the positive reals.
Approximation: As you have observed, the problem is just fulfilling the sign constraints. The partial derivatives are $\frac{\partial f}{\partial x}=cbx^{c-1}$ and $\frac{\partial f}{\partial y}=dey^{e-1}$. Since $x$ and $y$ are positive, the derivatives are positive if $cb > 0$ and $de>0$. To make sure that $f>0$ as well, a simple sufficient condition is that all coefficients are positive, $a, b, c, d, e>0$. Mixed signs can fail: assume for example $b<0$ and $c<0$. Then sufficiently small $x$ values will lead to arbitrarily negative $bx^c$ values, which in turn will make $f$ negative.
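A quick numerical sanity check of this claim; the coefficient values below are arbitrary illustrative choices, not fitted values:

```python
import numpy as np

# Hypothetical positive coefficients (illustrative only).
a, b, c, d, e = 0.5, 1.0, 0.7, 2.0, 1.3

def f(x, y):
    return a + b * x**c + d * y**e

# Grid over a patch of the positive quadrant.
x = np.linspace(1e-3, 10, 200)
y = np.linspace(1e-3, 10, 200)
X, Y = np.meshgrid(x, y)

Z = f(X, Y)
dfdx = c * b * X**(c - 1)  # partial derivative in x
dfdy = d * e * Y**(e - 1)  # partial derivative in y

# With all coefficients positive, f > 0 and both partials are positive.
assert (Z > 0).all() and (dfdx > 0).all() and (dfdy > 0).all()
```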
So you need to ensure that your least squares optimisation will not run into negative values for the coefficients. The standard way to do this is to reparameterise the coefficients as exponentials. That is, you write $f(x,y)=\exp(\alpha) + \exp(\beta)x^{\exp(\gamma)} + \exp(\delta)y^{\exp(\eta)}$ and solve the unconstrained problem in terms of the new parameters $\alpha,\ldots,\eta$. This is a non-linear least squares problem which can be solved with standard numerical methods and tools.
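A minimal sketch of this reparameterised fit using SciPy's `least_squares`, on synthetic data generated from assumed coefficients (names and values here are illustrative, not part of your problem):

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data from assumed positive coefficients (illustrative only).
rng = np.random.default_rng(0)
x = rng.uniform(0.1, 5.0, 100)
y = rng.uniform(0.1, 5.0, 100)
v = 0.5 + 1.2 * x**0.8 + 0.6 * y**1.5

def residuals(p, x, y, v):
    # p = (alpha, beta, gamma, delta, eta); coefficients exp(p) are
    # positive by construction, so the constraints hold automatically.
    a, b, c, d, e = np.exp(p)
    return a + b * x**c + d * y**e - v

p0 = np.zeros(5)  # all coefficients start at exp(0) = 1
sol = least_squares(residuals, p0, args=(x, y, v))
a, b, c, d, e = np.exp(sol.x)  # recovered positive coefficients
```

A residual norm near zero after the fit indicates the data is (numerically) interpolated by the model, not just approximated.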
Interpolation: If an interpolating solution exists, least squares will find it (the minimal squared error will be zero), so no extra work is required.