We have a polynomial in three variables, of degree at most 2 in each variable:
$f(x,y,z)=\sum_{i=0}^{2} \sum_{j=0}^{2} \sum_{k=0}^{2} \beta_{ijk} x^i y^j z^k$
We can collect terms by powers of $x$ and rewrite $f$ as follows:
$f(x,y,z) = a_{x,0} + a_{x,1} x + a_{x,2} x^2$
where $a_{x,0}, a_{x,1}, a_{x,2}$ are polynomials in $y,z$. The same can be done for $y$ and $z$.
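For concreteness, this decomposition is easy to compute symbolically; here is a small sketch using sympy (my own illustration, using the example $f$ that appears further below):

```python
# Sketch: extract a_{x,0}, a_{x,1}, a_{x,2} by collecting powers of x.
import sympy as sp

x, y, z = sp.symbols('x y z')
# the example f from the question, degree <= 2 in each variable
f = sp.expand((1 - x*y)*(1 + x + z - x*z)*(1 - y*z))

# a_{x,i} = coefficient of x^i in f, a polynomial in y and z only
a_x = [f.coeff(x, i) for i in range(3)]

# sanity check: the decomposition reassembles f
assert sp.expand(sum(a_x[i]*x**i for i in range(3)) - f) == 0
```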
1) Under what conditions (for what class of functions $f$) does there exist a solution to the following system of equations?
$ a_{x,0} + a_{x,2} x^2 =0 $
$ a_{y,0} + a_{y,2} y^2 =0 $
$ a_{z,0} + a_{z,2} z^2 =0 $
$ f(x,y,z) = \beta_{111} x y z$
Here is an example $f$ for which there exists a solution:
$f = (1-x y) (1+x+z-x z) (1-y z) = 1+x-x y-x^2 y+z-x z-y z \mathbf{-2} x y z+x^2 y z+x y^2 z+x^2 y^2 z-y z^2+x y z^2+x y^2 z^2-x^2 y^2 z^2$
Then the following is a solution:
$ x=0$, $y=1$, and $z=-1$
Another solution is:
$x=0.214413$, $y=2.41421$, $z=-0.800199$
One can verify that for both of these solutions, $f(x,y,z)=\beta_{111}xyz=-2xyz$.
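As a quick numerical sanity check (my own, not part of the question), both claimed solutions can be plugged in directly:

```python
# Verify f(x,y,z) = -2xyz at the two claimed solutions of the example.
def f(x, y, z):
    return (1 - x*y)*(1 + x + z - x*z)*(1 - y*z)

solutions = [(0, 1, -1), (0.214413, 2.41421, -0.800199)]
for xs, ys, zs in solutions:
    # beta_111 = -2 for this example, so f should equal -2*x*y*z
    assert abs(f(xs, ys, zs) - (-2*xs*ys*zs)) < 1e-3
```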
Here is another example for which no solution exists: $f(x,y,z) = 5x-2xyz+1$. There is obviously no solution to the system $1+0\cdot x^2=0$, $(1+5x)+0\cdot y^2=0$, $(1+5x)+0\cdot z^2=0$, $5x-2xyz+1=-2xyz$, since the first equation reduces to $1=0$.
2) (bonus) What if we are only interested in solutions where $xyz \neq 0$?
3) (bonus) What if, instead of 3 variables, we had $n$ variables?
EDIT: solutions to the case $n=2$, i.e. only two variables, would also be of interest!
A little bit of background: I am trying to develop an algorithm to check the coefficient of $x_1 x_2 \cdots x_n$ in a polynomial (given as a product of several lower-order polynomials, as in the example above). More formally, we want to check whether $\beta_{11\ldots1} = \frac{\partial^n }{\partial x_1 \partial x_2 \cdots \partial x_n} f \bigg|_{x_1=x_2=\cdots=x_n=0}$ is equal to zero or not. In general this is an NP-hard problem. However, I am interested in the class of such polynomials for which this can be checked efficiently. For example, if a nonzero solution to the above system of equations exists, one can solve the first $n$ equations numerically and then find $\beta_{11\ldots1}= \frac{f({x_1}^*,{x_2}^*,\ldots,{x_n}^*)}{{x_1}^*{x_2}^*\cdots{x_n}^*}$. (See here for more background and a proof of the complexity of the general case.)
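The numeric recipe at the end of this paragraph can be sketched for the running example (a sketch only; it assumes a solution with $x^* y^* z^* \neq 0$ is already in hand, here the second solution given above):

```python
# Recover beta_111 for the example f from a solution with xyz != 0,
# and compare against the exact mixed partial at the origin.
import sympy as sp

x, y, z = sp.symbols('x y z')
f = (1 - x*y)*(1 + x + z - x*z)*(1 - y*z)

# exact coefficient of xyz via the mixed partial evaluated at 0
beta_111 = sp.diff(f, x, y, z).subs({x: 0, y: 0, z: 0})
assert beta_111 == -2

# numeric recovery from the second solution (all coordinates nonzero)
xs, ys, zs = 0.214413, 2.41421, -0.800199
fn = sp.lambdify((x, y, z), f)
estimate = fn(xs, ys, zs) / (xs*ys*zs)
assert abs(estimate - (-2)) < 1e-3
```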
Here is an attempt at the case $n=2$:
$ a_{x,0} + a_{x,2} x^2 =0 $
$ a_{y,0} + a_{y,2} y^2 =0 $
$ f(x,y) = \beta_{11} x y$
We can rewrite these three equations as follows:
$ f(x,y) = x (\beta_{10}+\beta_{11}y+\beta_{12}y^2)$
$ f(x,y) = y (\beta_{01}+\beta_{11}x+\beta_{21}x^2)$
$ f(x,y) = \beta_{11} x y $
Subtracting the third equation from the first two, we have:
$ x (\beta_{10}+\beta_{12}y^2) = y (\beta_{01}+\beta_{21}x^2) =0 $
Comparing these two equations with the first two original equations, it is not hard to see that if $f$ is separable (i.e. $f = f_1(x_1) f_2(x_2) \cdots f_n(x_n)$), then the two systems are equivalent.
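One way to see the factorizations behind this observation is to take generic separable coefficients (a symbolic check only; the coefficient names $c_i, d_j$ are my own):

```python
# For separable f = f1(x) f2(y) we have beta_ij = c_i d_j, so the first
# equation a_{x,0} + a_{x,2} x^2 = 0 factors as (c0 + c2 x^2) f2(y) = 0,
# while the subtracted equation x (beta_10 + beta_12 y^2) = 0 factors as
# c1 x (d0 + d2 y^2) = 0.
import sympy as sp

x, y = sp.symbols('x y')
c0, c1, c2, d0, d1, d2 = sp.symbols('c0 c1 c2 d0 d1 d2')
f1 = c0 + c1*x + c2*x**2
f2 = d0 + d1*y + d2*y**2
f = sp.expand(f1*f2)                      # beta_ij = c_i * d_j

eq1 = f.coeff(x, 0) + f.coeff(x, 2)*x**2  # a_{x,0} + a_{x,2} x^2
sub1 = x*(c1*d0 + c1*d2*y**2)             # x (beta_10 + beta_12 y^2)

assert sp.expand(eq1 - (c0 + c2*x**2)*f2) == 0
assert sp.expand(sub1 - c1*x*(d0 + d2*y**2)) == 0
```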