For what parameters does a sequence converge in $S$?


Let $S$ be the space of rapidly decreasing functions $f\in C_0^\infty(\mathbb R^n)$, i.e. such that for any multi-indices $\alpha$ and $\beta$ there is a constant $M_{\alpha,\beta}$ with $$|x^\alpha D^\beta f(x)|\leq M_{\alpha,\beta}.$$

We have a countable family of seminorms $$\|f\|_k = \max\limits_{|\alpha|\leq k}\sup\limits_{x\in\mathbb R^n}\left|(1+|x|^k)D^\alpha f(x)\right|.$$

Convergence is defined as follows: $f_n\to f$ in $S$ if $\|f_n-f\|_k\to0$ as $n\to\infty$ for every $k=0,1,2,\dots$
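As an illustration of these seminorms (my own sketch, not part of the original question), one can approximate $\|f\|_k$ numerically in one dimension by taking exact symbolic derivatives and the supremum over a finite grid; the helper `seminorm` and the grid below are my own choices.

```python
# A rough numerical sketch (hypothetical helper): approximate the seminorm
# ||f||_k = max_{|alpha|<=k} sup_x |(1+|x|^k) D^alpha f(x)| for a 1-D
# Schwartz function given as a sympy expression. The sup is taken over a
# finite grid, so this is only an illustration, not a rigorous computation.
import sympy as sp

x = sp.symbols('x', real=True)

def seminorm(f_expr, k, grid=None):
    if grid is None:
        grid = [i / 10 for i in range(-100, 101)]  # sample points in [-10, 10]
    best = 0.0
    for order in range(k + 1):
        d = sp.diff(f_expr, x, order)          # exact derivative D^order f
        g = sp.lambdify(x, (1 + sp.Abs(x)**k) * sp.Abs(d), 'math')
        best = max(best, max(g(t) for t in grid))
    return best

# f_n(x) = n^{-a} exp(-n^b x^2) with a = 1, b = 0: seminorms shrink like 1/n
print(seminorm(sp.exp(-x**2) / 5, 2))
print(seminorm(sp.exp(-x**2) / 10, 2))  # about half of the previous value
```

Since the seminorms are homogeneous, scaling $f$ by a constant scales every $\|f\|_k$ by the same constant, which is why the second value is half the first.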

Here's the problem:

Let $a$ and $b$ be real parameters. For what values of the parameters does a sequence $\frac{1}{n^a}e^{-n^b|x|^2}$ converge in $S$?

Solution attempt

We know that $S$ is complete, so it suffices to find the values of the parameters for which the sequence in question is Cauchy.

$$\|f_n-f_m\|_k = \max\limits_{|\alpha|\leq k}\sup\limits_{x\in\mathbb R^n}\left|(1+|x|^k)D^\alpha\left(\frac{1}{n^a}e^{-n^b|x|^2}-\frac{1}{m^a}e^{-m^b|x|^2}\right)\right|$$

If $a > 0,\, b = 0$, then the Cauchy property is easy to verify.
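To make that step explicit (a one-line computation, added here for completeness): for $b=0$ the $n$-dependence factors out of the seminorm,

$$\|f_n-f_m\|_k=\left|\frac{1}{n^a}-\frac{1}{m^a}\right|\,\bigl\|e^{-|x|^2}\bigr\|_k,$$

and each $\|e^{-|x|^2}\|_k$ is a finite constant because $e^{-|x|^2}\in S$; for $a>0$ the prefactor tends to $0$ as $n,m\to\infty$, so the sequence is Cauchy.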

But I don't know what to do with the rest of the cases.

Were it not for the factor $(1+|x|^k)$, I could use the Fourier transform to turn differentiation into multiplication.

Could you give me any hints?

Best answer

Not a complete answer, but it increases our knowledge nevertheless. If the sequence converges in $\mathcal S$, then it converges pointwise.

Therefore, if $a=0$, we have the following cases:

  1. $b=0$: nothing interesting here, the sequence is constant.

  2. $b<0$: the sequence converges pointwise to the constant function $f(x)=1$ for all $x$.

  3. $b>0$: the sequence converges pointwise to the function $$f(x) =\begin{cases}1,&x=0,\\0,&x\ne 0.\end{cases} $$
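These three pointwise limits are easy to check numerically (my own illustration; the sample point $x=3$ and $n=10^6$ are arbitrary choices):

```python
# Pointwise behaviour of f_n(x) = n^{-a} exp(-n^b |x|^2) for a = 0,
# evaluated at a sample point x = 3 and at the origin for large n.
from math import exp

def f(n, x, a, b):
    return n ** (-a) * exp(-n ** b * x * x)

n = 10 ** 6
print(f(n, 3.0, a=0, b=-1))  # b < 0: n^b -> 0, so f_n(x) -> 1 everywhere
print(f(n, 3.0, a=0, b=1))   # b > 0, x != 0: f_n(x) -> 0
print(f(n, 0.0, a=0, b=1))   # b > 0, x  = 0: f_n(0) = 1 for every n
```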

Therefore we can safely rule out the case $a=0$, $b\ne 0$: in both cases the pointwise limit is not in $\mathcal S$ (for $b<0$ it does not decay, and for $b>0$ it is not even continuous), hence the sequence does not converge in $\mathcal S$.

Similarly, if $a<0$, the sequence does not converge in $\mathcal S$ either, because the pointwise limit at $x=0$ is infinite ($f_n(0)=n^{-a}\to\infty$).

Hence the only interesting case is $a>0$. If $b=0$, the sequence $n^{-a}e^{-|x|^2}$ converges to zero in $\mathcal S$.

At last, we need to study $a>0,\, b>0$ and $a>0,\,b<0$.

For the case $a>0,\,b>0$ we will consider the derivatives of even order with respect to $x_1$, taken at $x=0$. One can show that the derivative of order $2k$ equals $2^kn^{kb-a}s_k$, where $s_k$ is a sequence of integers with $|s_{k+1}|>|s_k|$. The first elements are $-1,\,3,\,-15,\,105,\,-945$, etc., i.e. the hypothesis is $s_k=(-1)^k (2k-1)!!$.

Therefore, as soon as $kb>a$, the sequence $2^kn^{kb-a}s_k$ is unbounded in $n$; since the derivatives of order $2k$ at the origin must stay bounded for a sequence converging in $\mathcal S$, we lose the convergence.
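The claimed formula for the even-order derivatives at the origin can be checked symbolically; the script below is my own verification, with sample values $a=1$, $b=2$ (any $a>0$, $b>0$ would do), and it confirms the pattern $s_k=(-1)^k(2k-1)!!$.

```python
# Verify (for sample parameters) that the 2k-th x-derivative of
# f_n(x) = n^{-a} exp(-n^b x^2) at x = 0 equals 2^k n^{kb-a} s_k,
# with s_k = (-1)^k (2k-1)!!.
import sympy as sp

x, n = sp.symbols('x n', positive=True)
a, b = 1, 2  # sample values with a > 0, b > 0 (my own choice)
f = n ** (-a) * sp.exp(-n ** b * x ** 2)

for k in range(1, 6):
    deriv = sp.diff(f, x, 2 * k).subs(x, 0)
    s_k = (-1) ** k * sp.factorial2(2 * k - 1)
    assert sp.simplify(deriv - 2 ** k * n ** (k * b - a) * s_k) == 0
    print(k, s_k)  # s_k runs through -1, 3, -15, 105, -945
```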

The final case is $a>0,\,b<0$, which I have yet to get a good grip on.