Rigorous bounds for robust multivariate polynomial regression -- how does the number of data points scale with degree?


Suppose I have a data set generated by a multivariate polynomial function, say a polynomial of total degree $K$ in $k$ variables: $$f(x_1,\dots, x_k) = \sum_{n_1+n_2+\dots + n_k\leq K} a_{n_1n_2\dots n_k}x_1^{n_1}x_2^{n_2}\dots x_k^{n_k}, $$ where $x_i\in [-1,1]$ for all $i$. My dataset is the set of pairs $\{(\mathbf{x}_i,y_i)\}_i$, with $\mathbf{x}_i\in[-1,1]^k$, where we have the guarantee that $$ |y_i - f(\mathbf{x}_i)|<\eta$$ with probability at least $1-\delta$, and where we assume that the sample points $\mathbf{x}_i$ are drawn i.i.d. from the uniform distribution on $[-1,1]^k$. Suppose also that $|f|<1$ everywhere, if that helps.

Are there any known algorithms that reconstruct an approximation $g$ to $f$ with an error guarantee such as $\|g-f\|_\infty<\epsilon$? I'm particularly interested in how the number of samples needed scales with $K$, $\eta$, and $\epsilon$.
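To make the setup concrete, here is a minimal sketch (not an answer to the question) of the naive approach: ordinary least squares on the monomial basis, for small illustrative values $k=2$, $K=3$. All names and the $10\times$ oversampling factor are arbitrary choices of mine, not from any reference; the question is precisely what oversampling is provably sufficient.

```python
# Least-squares fit of a degree-K polynomial in k variables from noisy
# samples on [-1, 1]^k. Illustrative sketch only; parameters are ad hoc.
import itertools
import numpy as np

rng = np.random.default_rng(0)
k, K = 2, 3        # number of variables, total degree
eta = 0.05         # noise bound: |y_i - f(x_i)| < eta

# All exponent tuples (n_1, ..., n_k) with n_1 + ... + n_k <= K.
exps = [e for e in itertools.product(range(K + 1), repeat=k) if sum(e) <= K]

def design(X):
    """Monomial design matrix: column j is prod_i X[:, i]**exps[j][i]."""
    return np.column_stack([np.prod(X ** np.array(e), axis=1) for e in exps])

# Ground-truth coefficients, scaled so |f| stays modest on [-1, 1]^k.
a_true = rng.normal(size=len(exps)) / len(exps)

n_samples = 10 * len(exps)   # ad hoc oversampling of the basis size
X = rng.uniform(-1.0, 1.0, size=(n_samples, k))
y = design(X) @ a_true + rng.uniform(-eta, eta, size=n_samples)

# Least-squares estimate of the coefficients.
a_hat, *_ = np.linalg.lstsq(design(X), y, rcond=None)
print(np.max(np.abs(a_hat - a_true)))   # coefficient error
```

Note that the basis size $\binom{K+k}{k}$ grows quickly in $k$, and the conditioning of the monomial design matrix degrades with $K$, which is part of why rigorous sample-complexity bounds (and better bases, e.g. Chebyshev) are of interest here.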

I believe this is known in the univariate case; see, for example, https://arxiv.org/abs/1708.03257.