Is the gaussian function a Nash function on $\mathbb{R}$?


Let me explain the question. Take a gaussian function $f:t\in \mathbb{R} \mapsto \exp(-t^2)$. Does there exist a nontrivial polynomial $P\in \mathbb{R}[X,Y]$ such that $$ \forall t\in \mathbb{R},\qquad P(t,f(t))=0 ? $$ (If such a polynomial exists, then the function $f$ is called a Nash function on $\mathbb{R}$)

I know that $t\mapsto \exp (t)$ and $t\mapsto \sin t$ are not Nash functions (on $\mathbb{R}$), but the proofs are based on the periodicity of $\sin$ and the fact that $\exp$ is unbounded. For the gaussian, I don't know whether the function is Nash or not. Do you know if it is? And could you provide a proof or a reference?
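For contrast, the function $t \mapsto 1/(1+t^2)$ *is* a Nash function on $\mathbb{R}$: it satisfies $P(t, f(t)) = 0$ with $P(X, Y) = (1 + X^2)Y - 1$. A quick numerical sanity check of that relation (the code and its function names are illustrative only, not part of the question):

```python
import math

# f(t) = 1/(1 + t^2) is Nash on R: it satisfies P(t, f(t)) = 0
# for the nontrivial polynomial P(X, Y) = (1 + X^2)*Y - 1.
def f(t):
    return 1.0 / (1.0 + t * t)

def P(x, y):
    return (1.0 + x * x) * y - 1.0

# P(t, f(t)) vanishes identically, up to floating-point round-off.
residuals = [abs(P(t, f(t))) for t in [-10.0, -1.5, 0.0, 0.3, 7.0]]
print(max(residuals))
```

The gaussian admits no such relation, as the answer below shows.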

Best answer:

I assume $P(X, Y)$ is not allowed to be the trivial polynomial, i.e., that $P(X, Y) \ne 0$.

Then the answer is "No"; we can see this as follows:

Suppose there were such a polynomial $P(X, Y) \in \Bbb R[X, Y]$; then any such $P(X, Y)$ may be written as

$P(X, Y) = \displaystyle \sum_{n = 0}^N \sum_{i + j = n} p_{ij}X^iY^j, \tag{1}$

where $p_{ij} \in \Bbb R$ and $N$ is the degree of $P(X, Y)$, that is, the greatest of the values $i + j$ over all the terms $p_{ij}X^iY^j$ occurring in $P(X, Y)$. We write $P(X, Y)$ as the sum of two terms $r(X)$ and $q(X, Y)$, where

$r(X) = \displaystyle \sum_m p_{m0}X^m \tag{2}$

and

$q(X, Y) = \displaystyle \sum_{n = 1}^N \sum_{i + j = n, j \ge 1} p_{ij} X^i Y^j; \tag{3}$

$r(X)$ consists of those terms of $P(X, Y)$ which are of degree $0$ in $Y$, including the constant term $p_{00}$; all other terms, those of degree one or greater in $Y$, are absorbed into $q(X, Y)$. We thus have

$P(X, Y) = r(X) + q(X, Y). \tag{4}$

If we now substitute $X = t$ and $Y = e^{-t^2}$ into (4), we obtain

$P(t, e^{-t^2}) = r(t) + q(t, e^{-t^2}) = \displaystyle \sum_m p_{m0}t^m + \sum_{n = 1}^N \sum_{i + j = n, j \ge 1} p_{ij} t^i (e^{-t^2})^j, \tag{5}$

so we have by hypothesis

$r(t) + q(t, e^{-t^2}) = \displaystyle \sum_m p_{m0}t^m + \sum_{n = 1}^N \sum_{i + j = n, j \ge 1} p_{ij} t^i (e^{-t^2})^j = P(t, e^{-t^2}) = 0. \tag{6}$

Now the critical observation to be made at this point is that any term of the form $p_{ij}t^i (e^{-t^2})^j$, $i \ge 0$, $j \ge 1$, $p_{ij} \in \Bbb R$, obeys

$\lim_{t \to \infty} p_{ij}t^i (e^{-t^2})^j = 0; \tag{7}$

(7) is in fact very easy to see if we consider the expression

$t^{-i}e^{t^2} = t^{-i} \displaystyle \sum_{n = 0}^\infty \dfrac{(t^2)^n}{n!} = \displaystyle \sum_{n = 0}^\infty \dfrac{t^{2n - i}}{n!} = \displaystyle\sum_{2n < i}\dfrac{t^{2n - i}}{n!} + \sum_{2n \ge i} \dfrac{t^{2n - i}}{n!}; \tag{8}$

The first sum on the right of (8) clearly approaches $0$ as $t \to \infty$, since each exponent $2n - i$ occurring there is negative; as for the second sum, an application of the ratio test to successive terms yields

$\rho_n(t) = \dfrac{\dfrac{t^{2n + 2 - i}}{(n + 1)!}} {\dfrac{t^{2n - i}}{n!}} = \dfrac{t^2}{n+ 1} > 0. \tag{9}$

We see from (9) that for fixed $t$,

$\lim_{n \to \infty} \rho_n(t) = 0, \tag{10}$

which shows that the series

$s(t) = \displaystyle \sum_{2n \ge i} \dfrac{t^{2n - i}}{n!} \tag{11}$

converges. However, the ratio $\rho_n(t)$ may also be used to show that the sum $s(t)$ takes arbitrarily large values for sufficiently large $t$: let $n_0$ be the smallest $n$ such that $2n > i$; we may then choose $t$ so that $t^{2n_0 - i} / n_0!$ is as large as we please, and since every term of $s(t)$ is positive for $t > 0$, it follows that

$s(t) > \dfrac{t^{2n_0 - i}}{(n_0!)}; \tag{12}$

thus we see that

$\lim_{t \to \infty}s(t) = \infty, \tag{13}$

and from this it follows that

$\lim_{t \to \infty} t^{-i}e^{t^2} = \infty, \tag{14}$

and thus

$\lim_{t \to \infty} t^ie^{-t^2} = 0; \tag{15}$

furthermore, since for $j \ge 1$

$t^i(e^{-t^2})^j = t^ie^{-t^2}(e^{-t^2})^{j - 1} \le t^ie^{-t^2}, \tag{16}$

we find that

$\lim_{t \to \infty}t^i(e^{-t^2})^j = 0 \tag{17}$

as well. Referring to (6) we see

$\lim_{t \to \infty}q(t, e^{-t^2}) = \lim_{t \to \infty}\sum_{n = 1}^N \sum_{i + j = n, j \ge 1} p_{ij} t^i (e^{-t^2})^j = 0. \tag{18}$
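The decay in (7) and (17) can be seen concretely in a small numerical illustration (the exponents $i = 10$, $j = 1$ are an arbitrary choice of mine): even a high power of $t$ is crushed by $e^{-t^2}$.

```python
import math

# A single term t^i * (e^{-t^2})^j with i >= 0, j >= 1, as in (7)/(17).
def term(t, i, j):
    return t**i * math.exp(-t * t) ** j

# Despite the t^10 factor, the gaussian factor wins decisively.
for t in [1.0, 2.0, 4.0, 8.0]:
    print(t, term(t, i=10, j=1))
```

At $t = 8$ the term is already below $10^{-18}$, while the $t^{10}$ factor alone exceeds $10^9$.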

Now suppose $r(t) \ne 0$; then, writing (6) as

$r(t) = -q(t, e^{-t^2}), \tag{19}$

and letting $t \to \infty$, we see that the right-hand side approaches zero, whilst the left either approaches the nonzero constant $p_{00}$ (if $r$ is constant) or diverges to $\pm \infty$; this contradiction shows that we must have $r(t) = 0$, i.e., $p_{m0} = 0$ for all $m$.

Having dispensed with the coefficients $p_{m0}$ of $P(X, Y)$, we may divide

$P(t, e^{-t^2}) = q(t, e^{-t^2}) = \displaystyle \sum_{n = 1}^N \sum_{i + j = n, j \ge 1} p_{ij} t^i (e^{-t^2})^j = 0, \tag{20}$

by $e^{-t^2} \ne 0$, and find

$e^{t^2}P(t, e^{-t^2}) = \displaystyle \sum_{n = 1}^N \sum_{i + j = n, j \ge 1} p_{ij} t^i (e^{-t^2})^{j - 1} = \displaystyle \sum_m p_{m1}t^m + \sum_{n = 2}^N \sum_{i + j = n, j \ge 2} p_{ij} t^i (e^{-t^2})^{j - 1} = 0; \tag{21}$

we may now repeat the above argument to show that $p_{m1} = 0$ for all $m$, and so on, until all the $p_{ij}$ are exhausted and we have shown that $P(X, Y) = 0$, the forbidden case.
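The peeling step can be illustrated numerically on a toy example of my own choosing, $q(t, Y) = 3t^2 Y + 5tY^2$ at $Y = e^{-t^2}$: multiplying by $e^{t^2}$ exposes the $j = 1$ layer $3t^2$, and what remains still decays to $0$, so the argument surrounding (19) applies again.

```python
import math

# Toy q(t, Y) = 3*t^2*Y + 5*t*Y^2 evaluated at Y = e^{-t^2}.
# Multiplying by e^{t^2} "peels off" one power of Y:
#   e^{t^2} * q(t, e^{-t^2}) = 3*t^2 + 5*t*e^{-t^2},
# i.e. the j = 1 polynomial layer plus a remainder that still -> 0.
def q(t):
    y = math.exp(-t * t)
    return 3 * t**2 * y + 5 * t * y**2

for t in [2.0, 4.0, 8.0]:
    peeled = math.exp(t * t) * q(t)
    print(t, peeled - 3 * t**2)  # remainder 5*t*e^{-t^2} -> 0
```

Repeating this peel-and-take-limits cycle is exactly how the proof forces every layer of coefficients to vanish.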

This contradiction demonstrates that $e^{-t^2}$ is not a Nash function.