Asymptotics of $\sum_{n\geq 1 } \frac{\sin n^2 t }{ n^2 } $


This is the Riemann function.

I would like to determine its asymptotics as $t \rightarrow 0^+ $.

First let $t=x^2 $, so that we treat the series

$$ \sum_{n\geq 1 } \frac{\sin n^2 x^2}{ n^2}. $$

We use the Mellin transform. For that, we need the Mellin transform of the function $\sin x^2$. By a change of variables, and using the fact that the Mellin transform of the function $\sin x $ is $\Gamma(s) \sin \frac{\pi s }{2}$, we get

$$ \mathcal{M}[\sin x^2 ;s ] = \frac{1}{2}\Gamma(\frac{s}{2}) \sin\frac{\pi s }{4} , \;\;\;0 < Re(s) < 2. $$
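As a quick numerical sanity check on this formula (not part of the original argument; it assumes the `mpmath` library is available), substituting $u=x^2$ turns the transform into the fixed-period oscillatory integral $\frac12\int_0^\infty u^{s/2-1}\sin u\,du$, which `mpmath.quadosc` can evaluate:

```python
# Sanity check of M[sin x^2; s] = (1/2) Gamma(s/2) sin(pi s/4), 0 < Re(s) < 2.
# After the substitution u = x^2 the transform becomes
# (1/2) * int_0^inf u^{s/2 - 1} sin(u) du,
# an oscillatory integral with fixed period 2*pi that quadosc can handle.
from mpmath import mp, quadosc, gamma, sin, pi, inf, mpf

mp.dps = 25

def mellin_sin_sq(s):
    return mpf(1)/2 * quadosc(lambda u: u**(s/2 - 1) * sin(u),
                              [0, inf], period=2*pi)

for s in (mpf('0.5'), mpf('1'), mpf('1.5')):
    # numerical transform vs. the closed form (1/2) Gamma(s/2) sin(pi s/4)
    print(s, mellin_sin_sq(s), mpf(1)/2 * gamma(s/2) * sin(pi*s/4))
```

At $s=1$ both sides reduce to the Fresnel value $\int_0^\infty \sin(x^2)\,dx = \sqrt{\pi/8}$.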

The Mellin transform of the original series is then

$$ \frac{1}{2}\Gamma(\frac{s}{2}) \sin\frac{\pi s }{4} \zeta(s+2) , \;\;\;0 < Re(s) < 2. $$

Now we use the inversion formula to recover the original series. It is

$$\frac{1}{2\pi i }\int_C x^{-s} \frac{1}{2}\Gamma(\frac{s}{2}) \sin\frac{\pi s }{4} \zeta(s+2) ds , $$

where we can take the integration contour $C$ to be the vertical line from $1-i\infty $ to $1+ i\infty $.

The standard trick is then to shift the line of integration to the left. In this process, the poles of the integrand are picked up one by one.

Now, the problem is that the factor $\Gamma(\frac{s}{2})$ has poles at $s = 0, -2, -4, -6,$ etc. The factor $\zeta(s+2)$ has a single pole at $s = -1$. The factor $\sin \frac{\pi s}{4}$ has zeros at $s = 0, -4, -8,$ etc., and $\zeta(s+2)$ has trivial zeros at $s = -4, -6, -8,$ etc.

After these cancellations, the only remaining poles are $s = -1$ and $s = -2$. We get a quadratic polynomial in $x$.
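To make this concrete, here is a numerical approximation of the two surviving residues (a sketch assuming the `mpmath` library; each limit $(s-s_0)G(s)$ is evaluated at a point very close to the pole $s_0$):

```python
# Numerically approximate the two surviving residues of
# G(s) = (1/2) Gamma(s/2) sin(pi s/4) zeta(s+2)
# via (s - s0) * G(s) evaluated very close to each pole s0.
from mpmath import mp, gamma, sin, zeta, pi, sqrt, mpf

mp.dps = 30

def G(s):
    return mpf(1)/2 * gamma(s/2) * sin(pi*s/4) * zeta(s + 2)

d = mpf('1e-12')
res_m1 = d * G(-1 + d)   # residue at s = -1 (pole of zeta(s+2))
res_m2 = d * G(-2 + d)   # residue at s = -2 (pole of Gamma(s/2))
print(res_m1, sqrt(pi/2))   # both ~ 1.2533...
print(res_m2)               # ~ -0.5
```

So the residues of $x^{-s}G(s)$ at $s=-1$ and $s=-2$ contribute $\sqrt{\pi/2}\,x-\frac{1}{2}x^{2}$, i.e. $\sqrt{\pi t/2}-\frac{t}{2}$ in the original variable $t=x^2$.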

This is obviously wrong. The Riemann function is non-differentiable almost everywhere.

Could anyone show me where I made a mistake?

Answer 1:

You didn't check the growth of $\frac{1}{2}\Gamma(\frac{s}{2}) \sin\frac{\pi s }{4} \zeta(s+2)$ on vertical lines, which is needed to shift the contour to the left and deduce an asymptotic. By Stirling's formula, $|\Gamma(\frac{\sigma + it}{2})|$ decays like $e^{-\pi |t|/4}$ as $|t| \to \infty$, while $|\sin \frac{\pi(\sigma+it)}{4}|$ grows like $e^{\pi |t|/4}$, so the product decays only polynomially on vertical lines and the shifted integrals are not negligible.
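To see the failure concretely (a quick check assuming the `mpmath` library): on the line $\operatorname{Re}(s)=1$ the exponential decay of $\Gamma(\frac{s}{2})$ is exactly cancelled by the exponential growth of $\sin\frac{\pi s}{4}$, and the modulus of the integrand does not decay:

```python
# |(1/2) Gamma(s/2) sin(pi s/4) zeta(s+2)| on the line Re(s) = 1:
# the exponential decay of Gamma(s/2) (~ e^{-pi|t|/4}) is cancelled by the
# exponential growth of sin(pi s/4) (~ e^{pi|t|/4}), so the modulus
# stays of order 1 instead of decaying as |t| grows.
from mpmath import mp, gamma, sin, zeta, pi, mpc, mpf

mp.dps = 25

def G(s):
    return mpf(1)/2 * gamma(s/2) * sin(pi*s/4) * zeta(s + 2)

for t in (10, 100, 1000):
    print(t, abs(G(mpc(1, t))))
```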

You can look at $$f_k(u)=f \ast \frac{k}{\sqrt{2\pi}}e^{- k^2 u^2/2}, \qquad \text{where} \qquad f(u) = \sum_{n=1}^\infty \frac{\sin(n^2 e^{-u})}{n^2}.$$ Then $$\mathcal{L}[f_k](s) = \sin(\pi s/2)\,\Gamma(s)\, e^{s^2/(2k^2)}\, \zeta(2s+2).$$ This time everything is $L^1$ on vertical lines, so that as $u \to \infty$, $$f_k(u) =\frac{1}{2i\pi} \int_{(-1/2-\epsilon)}\mathcal{L}[f_k](s)\, e^{su}\,ds+ \operatorname{Res}\left(\mathcal{L}[f_k](s)\,e^{su},\, s=-\tfrac{1}{2}\right)= \frac{\sqrt{2\pi}}{2}\, e^{1 /(8k^2)}\, e^{-u/2}+O(e^{-(1/2+\epsilon)u}).$$

Since $f$ is uniformly continuous on $[a,\infty)$ and bounded, $f_k \to f$ uniformly on $[a,\infty)$. One can then check whether the same holds for $f_k u^{-m} e^{ub}\to f u^{-m}e^{ub} $ for $b \le 1/2$.
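The residue at $s=-1/2$ quoted above can be checked numerically in the same spirit (a sketch assuming the `mpmath` library, with $k=1$):

```python
# Check the residue of L(s) = sin(pi s/2) Gamma(s) e^{s^2/(2 k^2)} zeta(2s+2)
# at s = -1/2, the pole of zeta(2s+2): it should equal
# (sqrt(2 pi)/2) * e^{1/(8 k^2)}.
from mpmath import mp, gamma, sin, zeta, exp, sqrt, pi, mpf

mp.dps = 30
k = mpf(1)

def L(s):
    return sin(pi*s/2) * gamma(s) * exp(s**2/(2*k**2)) * zeta(2*s + 2)

d = mpf('1e-12')
res = d * L(mpf('-0.5') + d)          # (s - s0) L(s) near the pole
expected = sqrt(2*pi)/2 * exp(mpf(1)/(8*k**2))
print(res, expected)
```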

Answer 2:

Let $F(z)=\sum_{n \geq 1}{\frac{e^{in^2\pi z}}{n^2}}$, which is bounded and holomorphic on the upper half-plane $\mathbb{H}=\{z : \Im(z)>0\}$ and has a bounded continuous extension to the boundary (the reals). Its imaginary part on the reals, $R(x)=\sum_{n \geq 1}{\frac{\sin{n^2\pi x}}{n^2}}$, is the function above up to the substitution $t \to \pi x$, while the real part of $F$ is customarily called $C(x)$.

Using that $F$ is (almost) a primitive of the Jacobi theta function $\theta_0(z)=\sum_{n \in \mathbb{Z}}q^{n^2}, q=e^{i\pi z}$, as clearly $F'(z)=\frac{i\pi}{2}(\theta_0(z)-1)$, one can prove (by manipulating the functional equation of the Jacobi theta functions and the definition of $F$) the following results:

1: For real $x$, $F(x+1)=\frac{1}{2}F(4x)-F(x)$; this follows by simple manipulations from the definition of $F$ (split the sum into even and odd $n$ and note that $e^{in^2\pi}=(-1)^{n^2}=(-1)^n$).

2: For real positive $x, F(x)=F(0)+i\pi e^{\frac{i\pi}{4}}\sqrt{x}-{\frac{i\pi}{2}}x+e^{\frac{i\pi}{4}}x^{\frac{3}{2}}F(-\frac{1}{x})-\frac{3}{2}e^{\frac{i\pi}{4}}\int_{0}^{x}\sqrt{t}F(-\frac{1}{t})dt$

3: In particular, since $F$ is bounded, we get for $x>0$, $x \to 0$: $F(x)=F(0)+i\pi e^{\frac{i\pi}{4}}\sqrt{x}-{\frac{i\pi}{2}}x+O(x^{\frac{3}{2}})$,

which gives the precise asymptotics at zero by taking the imaginary part and noting that $R$ is odd. From relation 1 we get the differentiability of $R$ at $1$, with $R'(1)=-\frac{\pi}{2}$, and more generally $F'(1)=-\frac{i\pi}{2}$. The crucial functional equation 2 above also (more or less) immediately shows that $F, R, C$ are simultaneously differentiable or not at $x>0$, $-x$, $x+2$, $-\frac{1}{x}$, with the same derivative as at $1$. So we get the famous result that $R$ is differentiable at rational numbers of the form $\frac{\text{odd}}{\text{odd}}$, with derivative $-\frac{\pi}{2}$, and not differentiable at rationals whose numerator or denominator is even in lowest form.
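Both relation 1 and the asymptotics obtained from relation 3 (after taking imaginary parts) can be checked numerically with truncated sums. A rough sketch using only the Python standard library, at an arbitrary test point:

```python
# Check relation 1 and the small-x asymptotics from relation 3 using
# truncated sums.  F(x) = sum_{n>=1} exp(i n^2 pi x)/n^2 for real x;
# truncating at N terms leaves a tail bounded by 1/N.
import cmath
import math

def F(x, N=100000):
    return sum(cmath.exp(1j * n * n * math.pi * x) / (n * n)
               for n in range(1, N + 1))

# Relation 1: F(x+1) = (1/2) F(4x) - F(x), at an arbitrary real x.
x = 0.3137
diff = abs(F(x + 1) - (0.5 * F(4 * x) - F(x)))
print(diff)   # small, limited by the truncation error

# Relation 3, imaginary parts: R(x) ~ (pi/sqrt(2)) sqrt(x) - (pi/2) x
# as x -> 0+, with an O(x^{3/2}) error.
x = 0.001
R = F(x).imag
asym = math.pi / math.sqrt(2) * math.sqrt(x) - math.pi / 2 * x
print(R, asym)
```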

The proof of Hardy and Littlewood that $F, R, C$ are not differentiable at the irrationals, as they are not Lipschitz of order $\frac{3}{4}$ there, is quite hard and depends on deeper facts about the Jacobi theta function and the modular group.