number of points needed to check if a special polynomial remainder is zero


We know $f(x,y)$ is a polynomial in $x$ and $y$ with integer coefficients whose degree in $x$ and in $y$ is at most two:

$f(x,y)=\sum_{i,j\in \left \{ 0,1,2 \right \}}a_{ij}x^i y^j$

where the $a_{ij}$ are integers. We want to check whether $a_{00}+a_{02}+a_{20}+a_{22}=0$ by evaluating $f$ at some points in the $xy$-plane.

One solution is to evaluate $f$ at the following four points $(1,1), (1,-1), (-1,1), (-1,-1)$ and then check whether $ S = f(1,1)+f(1,-1)+f(-1,1)+f(-1,-1)$ is zero. It is easy to show that $S = 4(a_{00}+a_{02}+a_{20}+a_{22})$. The question is whether we can check that $a_{00}+a_{02}+a_{20}+a_{22}=0$ by evaluating $f(x,y)$ at fewer points (i.e. 3 points or fewer).
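The identity $S = 4(a_{00}+a_{02}+a_{20}+a_{22})$ can be sanity-checked numerically; here is a minimal sketch with randomly chosen integer coefficients (the coefficient values are arbitrary, not from the question):

```python
# Check S = f(1,1)+f(1,-1)+f(-1,1)+f(-1,-1) = 4*(a00+a02+a20+a22)
# for random integer coefficients a_ij.
import random

random.seed(0)
a = {(i, j): random.randint(-10, 10) for i in range(3) for j in range(3)}

def f(x, y):
    return sum(c * x**i * y**j for (i, j), c in a.items())

S = f(1, 1) + f(1, -1) + f(-1, 1) + f(-1, -1)
target = a[(0, 0)] + a[(0, 2)] + a[(2, 0)] + a[(2, 2)]
print(S == 4 * target)  # True: odd-index terms cancel over the four sign patterns
```

The cancellation happens because summing $x^i y^j$ over the four points gives $(1+(-1)^i)(1+(-1)^j)$, which is $4$ when both $i$ and $j$ are even and $0$ otherwise.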

What if the coefficients are not integers, and/or we are using a computer with limited arithmetic precision to do all the computations?

Note that $a_{00}+a_{02}+a_{20}+a_{22}$ is the remainder of $f(x,y)$ upon division by $(x^2-1), (y^2-1), x, y$; in other words, it equals $f(x,y)$ after the substitutions $ x^2\to1, x\to0, y^2\to1, y\to0$. (Not sure if this helps.)
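On a coefficient dictionary, the reduction described above is just a filter: $x^2\to1$ keeps the $i=2$ terms (as constants) and $x\to0$ kills the $i=1$ terms, likewise in $y$. A small sketch (the sample coefficient values are hypothetical):

```python
# Reduction x^2 -> 1, x -> 0 (and likewise in y) applied to a
# coefficient dict a[(i, j)], i, j in {0, 1, 2}.
def reduce_even_part(a):
    total = 0
    for (i, j), c in a.items():
        # x -> 0 drops i = 1 terms; x^2 -> 1 keeps i = 2 terms as constants.
        if i == 1 or j == 1:
            continue
        total += c
    return total  # equals a00 + a02 + a20 + a22

a = {(i, j): 10 * i + j for i in range(3) for j in range(3)}  # sample values
print(reduce_even_part(a))  # 0 + 2 + 20 + 22 = 44
```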


BEST ANSWER

Assuming that we ignore the possibility that a single value of $f$ could yield information about several linear combinations of the coefficients by virtue of that value being calculated in an extension field of the field of coefficients (see my comment under the OP) ...

You need to calculate the value of $f$ at four points to be able to figure out the sum $a_{00}+a_{02}+a_{20}+a_{22}.$

This can be shown as follows. Assume, for a contradiction, that knowing $f(P_i)$ at three points $P_i=(x_i,y_i), i=1,2,3,$ would suffice. Clearly we then have a relation of the form $$ a_{00}+a_{02}+a_{20}+a_{22}=a_1f(P_1)+a_2f(P_2)+a_3f(P_3) $$ that holds for some coefficients $a_i,i=1,2,3$, and all $f$ (we need the assumption that we are working over an infinite, or at least a large enough, field to justify this step). For this to work we would need to be able to express the last row of the matrix $$ A=\left( \begin{array}{ccccccccc} 1 & x_1 & x_1^2 & y_1 & y_1 x_1 & y_1x_1^2 & y_1^2 & y_1^2x_1 & y_1^2x_1^2\\ 1 & x_2 & x_2^2 & y_2 & y_2 x_2 & y_2x_2^2 & y_2^2 & y_2^2x_2 & y_2^2x_2^2\\ 1 & x_3 & x_3^2 & y_3 & y_3 x_3 & y_3x_3^2 & y_3^2 & y_3^2x_3 & y_3^2x_3^2\\ 1 & 0 & 1 & 0 & 0 & 0 & 1 & 0 & 1 \\ \end{array} \right) $$ as a linear combination of the top three rows.

Let us first consider the case where $x_1,x_2,x_3$ are all distinct. In that case the top left $3\times 3$ corner of $A$ is invertible as a Vandermonde matrix. This means that there are unique coefficients $a_1,a_2,a_3$ such that $$ a_1(1,x_1,x_1^2)+a_2(1,x_2,x_2^2)+a_3(1,x_3,x_3^2)=(1,0,1). $$ Looking at the next three columns we then see that we must also have $$ a_1y_1(1,x_1,x_1^2)+a_2y_2(1,x_2,x_2^2)+a_3y_3(1,x_3,x_3^2)=(0,0,0). $$ By linear independence we thus deduce that $a_iy_i=0$ for all $i$. Similarly the last three columns imply that $a_iy_i^2=a_i$ for all $i$. So whenever $a_i\neq0$ we must have both $y_i=0$ and $y_i^2=1$. This forces all the $a_i$ to vanish, which is a contradiction.

Assume then that there are repetitions among the $x$-coordinates. Without loss of generality we can assume that $x_3=x_1$. Clearly then $x_1\neq x_2$, as otherwise the first three columns already give a contradiction. Again, the linear independence of $(1,x_1,x_1^2)$ and $(1,x_2,x_2^2)$, together with looking at the columns in groups of three as above, implies that $$ \begin{aligned} a_1+a_3&=a_1y_1^2+a_3y_3^2\\ a_2&=a_2y_2^2\\ a_1y_1+a_3y_3&=0\\ a_2y_2&=0. \end{aligned} $$ Again we get a contradiction: $a_2\neq0$ (if $a_2=0$, the first three columns would force $(1,0,1)$ to be proportional to $(1,x_1,x_1^2)$, i.e. $x_1=0$ and $x_1^2=1$), yet $a_2y_2=0$ and $a_2y_2^2=a_2$ force $y_2=0$ and $y_2^2=1$.
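The conclusion above can be sanity-checked by brute force: for triples of distinct points with small integer coordinates, the row $(1,0,1,0,0,0,1,0,1)$ never lies in the span of the three rows $r(P_i)=(1,x,x^2,y,xy,x^2y,y^2,xy^2,x^2y^2)$. A pure-Python sketch using exact rational arithmetic (the point grid is an arbitrary choice for illustration):

```python
# Check that (1,0,1,0,0,0,1,0,1) is never in the span of three rows r(P_i).
from fractions import Fraction
from itertools import combinations

def row(x, y):
    # Monomials x^i y^j ordered as 1, x, x^2, y, xy, x^2 y, y^2, x y^2, x^2 y^2.
    return [Fraction(x**i * y**j) for j in range(3) for i in range(3)]

def rank(rows):
    # Gauss-Jordan elimination over the rationals.
    m = [r[:] for r in rows]
    rk, col = 0, 0
    while rk < len(m) and col < len(m[0]):
        piv = next((r for r in range(rk, len(m)) if m[r][col] != 0), None)
        if piv is None:
            col += 1
            continue
        m[rk], m[piv] = m[piv], m[rk]
        for r in range(len(m)):
            if r != rk and m[r][col] != 0:
                factor = m[r][col] / m[rk][col]
                m[r] = [u - factor * v for u, v in zip(m[r], m[rk])]
        rk += 1
        col += 1
    return rk

target = [Fraction(v) for v in (1, 0, 1, 0, 0, 0, 1, 0, 1)]
pts = [(x, y) for x in (-1, 0, 1, 2) for y in (-1, 0, 1, 2)]

# target is in the span of the rows iff adding it does not raise the rank.
in_span = any(
    rank([row(*p) for p in trip]) == rank([row(*p) for p in trip] + [target])
    for trip in combinations(pts, 3)
)
print(in_span)  # False: three evaluation points never suffice on this grid
```

This is of course only a finite check on one grid, not a proof; the proof is the linear-algebra argument above.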

Looks like this is generalizable, but I'm afraid I don't know exactly how.

ANOTHER ANSWER

If $f(x,y)=x^2$, then $a_{0,0}+a_{2,2}=0$ but $$f(1,1)+f(-1,1)+f(1,-1)+f(-1,-1)=4\ne0.$$

Now $$f(2,2)-4f(1,1)+6f(0,0)-4f(-1,-1)+f(-2,-2)=24a_{2,2}$$ and $f(0,0)=a_{0,0}$ so one can check that $a_{0,0}+a_{2,2}=0$ by evaluating at five points. I don't know if that is optimal.
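The five-point identity above (a fourth-order finite difference along the diagonal, which annihilates all powers $t^k$ with $k<4$ and so isolates the $t^4$ coefficient $a_{2,2}$ of $f(t,t)$) can be checked on random integer coefficients:

```python
# Check f(2,2) - 4 f(1,1) + 6 f(0,0) - 4 f(-1,-1) + f(-2,-2) = 24 * a22
# and f(0,0) = a00 for random integer coefficients.
import random

random.seed(1)
a = {(i, j): random.randint(-10, 10) for i in range(3) for j in range(3)}

def f(x, y):
    return sum(c * x**i * y**j for (i, j), c in a.items())

diff = f(2, 2) - 4 * f(1, 1) + 6 * f(0, 0) - 4 * f(-1, -1) + f(-2, -2)
print(diff == 24 * a[(2, 2)], f(0, 0) == a[(0, 0)])  # True True
```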