Checking non-negativity over an interval


I have been working with non-negative univariate polynomials, and I found the following equivalence for checking whether a polynomial is non-negative:

The polynomial $g(x) = \sum_{r=0}^k y_rx^r$ satisfies $g(x) \geq 0$ for all $x \in [0, a]$ if and only if there exists a positive semidefinite matrix $X = [x_{ij}]_{i,j=0,\ldots,k}$ such that \begin{equation*} \sum_{\substack{i,j=0,\ldots,k \\ i+j=2l-1}}x_{ij}=0, \qquad l=1,\ldots, k \end{equation*} \begin{equation*} \sum_{\substack{i,j=0,\ldots,k \\ i+j=2l}}x_{ij}=\sum_{m=0}^l y_m {k-m\choose l-m}a^{m}, \qquad l=0,\ldots, k. \end{equation*}
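As a sanity check (the variable names and the example polynomial below are mine, not from the paper), the substitution $x = at^2/(1+t^2)$ turns $(1+t^2)^k g(at^2/(1+t^2))$ into a polynomial in $u = t^2$ whose coefficient of $u^l$ should equal $\sum_{m=0}^{l} y_m \binom{k-m}{l-m} a^m$, the right-hand side above. A short Python sketch comparing a brute-force binomial expansion against that closed form:

```python
from math import comb

def lifted_coeffs(y, a):
    """Expand (1+t^2)^k g(a t^2/(1+t^2)) as a polynomial in u = t^2.

    Here g(x) = sum_r y[r] x^r and k = len(y) - 1.  Returns c with c[l]
    the coefficient of u^l; odd powers of t never appear.
    """
    k = len(y) - 1
    c = [0.0] * (k + 1)
    for r, yr in enumerate(y):
        # term y_r * a^r * u^r * (1+u)^(k-r), expanded by the binomial theorem
        for s in range(k - r + 1):
            c[r + s] += yr * a**r * comb(k - r, s)
    return c

def closed_form(y, a):
    """The claimed closed form: c_l = sum_{m=0}^{l} y_m * C(k-m, l-m) * a^m."""
    k = len(y) - 1
    return [sum(y[m] * comb(k - m, l - m) * a**m for m in range(l + 1))
            for l in range(k + 1)]

y = [1.0, -2.0, 1.5]   # example: g(x) = 1 - 2x + 1.5 x^2, so k = 2
a = 2.0
assert lifted_coeffs(y, a) == closed_form(y, a)
```

The two routes agree for any choice of coefficients, which is exactly the bookkeeping behind the even-degree conditions on $X$.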

This result is found in Bertsimas, D., & Popescu, I. (2005). Optimal inequalities in probability theory: A convex optimization approach. SIAM Journal on Optimization, 15(3), 780-804.

In the proof, found in the Appendix of the paper, it says:

We observe that $g(x)\geq 0$ for $x\in [0,a]$ if and only if $$(1+t^2)^k g\left(\frac{at^2}{1+t^2}\right) \geq 0 \qquad \text{for all }t.$$
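For concreteness, multiplying by $(1+t^2)^k$ clears every denominator, so the left-hand side is a genuine polynomial in $t$, in fact in $t^2$: $$(1+t^2)^k\, g\!\left(\frac{at^2}{1+t^2}\right)=\sum_{r=0}^k y_r\,(at^2)^r\,(1+t^2)^{k-r}.$$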

Why does this equivalence hold? I see that if $g(x)\geq 0$, then the composed polynomial must be non-negative too, but I am unable to see why the other implication is also true.

Moreover, I am curious whether similar results hold in higher dimensions. In particular, given a bivariate polynomial $f(x,z)=\sum_{r=0}^k\sum_{s=0}^l y_{r,s}x^rz^s$, can we find a characterization of non-negativity over a rectangle $[0,a]\times [0,b]$ using similar arguments? Maybe using a function like $$(1+t^2)^{k} (1+u^2)^{l}f\left(\frac{at^2}{1+t^2}, \frac{bu^2}{1+u^2}\right)$$ or something similar...

Any help would be appreciated.


On BEST ANSWER

Direct implication: since $t^2/(1+t^2) \in [0,1)$ for all $t \in \mathbb{R}$, we have $0 \leq \dfrac{a t^2}{1+t^2} < a$; in particular $\dfrac{a t^2}{1+t^2} \in [0, a]$, and therefore $$ g\left( \dfrac{a t^2}{1+t^2} \right) \ge 0, \quad t \in \mathbb{R}. $$

On the other hand, since $(1+t^2)^k > 0$, this is equivalent to $$ (1+t^2)^k g\left( \dfrac{a t^2}{1+t^2} \right) \ge 0, \quad t \in \mathbb{R}. $$

For the reverse implication, note that any $x \in [0, a)$ can be written as $x = \frac{a t_0^2}{1+t_0^2}$ with $t_0 = \sqrt{x/(a-x)} \ge 0$; since $(1+t_0^2)^k > 0$, the hypothesis forces $g(x) \ge 0$. The case $x = a$ follows by continuity of $g$ (equivalently, by letting $t_0 \to \infty$).
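A quick numerical check of that inverse substitution (the function name is mine): $t_0 = \sqrt{x/(a-x)}$ solves $x = at_0^2/(1+t_0^2)$ for every $x \in [0, a)$.

```python
import math

def t0(x, a):
    """Solve x = a t^2 / (1 + t^2) for t >= 0; valid for 0 <= x < a."""
    return math.sqrt(x / (a - x))

a = 3.0
for x in [0.0, 0.5, 1.0, 2.9]:
    t = t0(x, a)
    # substituting t back must recover x (up to floating-point error)
    assert abs(a * t**2 / (1 + t**2) - x) < 1e-12
```

As $x \to a$, $t_0 \to \infty$, which matches the continuity argument for the endpoint.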

The multivariable extension you propose works in the exact same way.