I know that the following statement can easily be proved by induction: If $A_1, \dots, A_n \subset k$ are infinite subsets of a field $k$ and $f \in k[x_1, \dots, x_n]$ vanishes on $A = A_1 \times \dots \times A_n$, then $f$ is already the zero polynomial.
Now my question is whether the sets $A_i$ really have to be infinite. I think it suffices to require $(\operatorname{deg}_{x_i}f) + 1 \le |A_i|$, where $\operatorname{deg}_{x_i}f$ is the degree of $f$ considered as a polynomial in $x_i$ with coefficients in $k[x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_n]$. It seems that the proof of the original statement carries over to this situation. It goes as follows:
If $n = 1$, then $f$ is a polynomial in one variable vanishing on $A_1$, and since $(\operatorname{deg}f) + 1 \le |A_1|$, this means that $f$ has at least $(\operatorname{deg}f) + 1$ roots. But a nonzero polynomial in one variable over a field has at most $\operatorname{deg}f$ roots, so $f$ must be the zero polynomial.
Now assume that $n > 1$ and that the statement is true for $n-1$. Write $f = \sum_i f_i(x_1, \dots, x_{n-1}) x_n^i$ as an element of $k[x_1, \dots, x_{n-1}][x_n]$. Let $(a_1, \dots, a_{n-1}) \in A_1 \times \dots \times A_{n-1}$ be arbitrary. Then $f(a_1, \dots, a_{n-1}, x_n) \in k[x_n]$ vanishes on $A_n$, so it has at least $|A_n| \ge (\operatorname{deg}_{x_n}f) + 1$ roots, while its degree is at most $\operatorname{deg}_{x_n}f$; hence it is the zero polynomial. This gives $f_i(a_1, \dots, a_{n-1}) = 0$ for all $i$. Since $(a_1, \dots, a_{n-1}) \in A_1 \times \dots \times A_{n-1}$ was arbitrary and since $(\operatorname{deg}_{x_j}f_i) + 1 \le (\operatorname{deg}_{x_j}f) + 1 \le |A_j|$ for all $1 \le j \le n-1$, we obtain by the induction hypothesis that $f_i$ is the zero polynomial for all $i$, and therefore $f = 0$.
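As a side remark (this example is mine, not part of the statement above), the bound $(\operatorname{deg}_{x_i}f) + 1 \le |A_i|$ seems to be sharp: for any finite subset $A_1 \subset k$, the polynomial

$$f(x_1, \dots, x_n) = \prod_{a \in A_1} (x_1 - a)$$

vanishes on $A_1 \times \dots \times A_n$ but is not the zero polynomial, and here $(\operatorname{deg}_{x_1}f) + 1 = |A_1| + 1$ exceeds $|A_1|$ by exactly one. For $k = \mathbb{F}_q$ and $A_1 = \mathbb{F}_q$ this is the familiar polynomial $x_1^q - x_1$, which vanishes on all of $\mathbb{F}_q^n$.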
Is there any mistake in this proof?