Why does Titchmarsh say that we can move the derivative under the integral sign in $\frac{2}{\pi}\int_0^\infty \frac{\Xi(t)}{t^2 + \frac{1}{4}} \cosh(\alpha t) \, dt$?


If we define the Riemann-Xi function as $$ \Xi(t) = \xi(\frac{1}{2} + it)$$ where $$\xi(s) = \frac{1}{2}s(s-1)\pi^{-\frac{s}{2}}\Gamma(\frac{s}{2})\zeta(s),$$

then according to Titchmarsh in his adaptation of Hardy's proof that the zeta function has infinitely many zeros on the critical line, if we consider the integral

$$ \frac{2}{\pi}\int_0^\infty \frac{\Xi(t)}{t^2 + \frac{1}{4}} \cosh(\alpha t) dt, $$ then "since $\zeta(\frac{1}{2} + it) = O(t^A)$, $\Xi(t) = O(t^Ae^{-\frac{1}{4}\pi t})$, and the above integral may be differentiated with respect to $\alpha$ any number of times provided that $\alpha < \frac{1}{4}\pi$."

I don't really understand either claim: why is $\Xi(t) = O(t^Ae^{-\frac{1}{4}\pi t})$, and why does this allow us to move the derivative under the integral sign any number of times as long as $\alpha < \frac{1}{4}\pi$? Can someone explain these things?

Here is my work so far:

After plugging the definition of $\xi(s)$ into the definition of $\Xi(t)$ we get that

$$ \Xi(t) = \frac{1}{2}(-\frac{1}{4} - t^2)\pi^{-\frac{1}{4} - \frac{it}{2}}\Gamma(\frac{1}{4} + \frac{it}{2})\zeta(\frac{1}{2} + it).$$

I can prove that for $|t| \geq 1$ and $\operatorname{Re}(s) \geq \frac{1}{2}$, $|\zeta(s)| \leq |t|^{\frac{1}{2} + \epsilon}$, and I can also show that $|\Gamma(\frac{1}{4} + \frac{it}{2})| \leq \frac{\Gamma(\frac{1}{4})}{|t|}.$

Additionally, $|\pi^{-\frac{1}{4} - \frac{it}{2}}| = \pi^{-\frac{1}{4}} = e^{-\frac{1}{4} \log(\pi)}$. I'm stuck after this. Can anyone help?


Best answer:

The bound on the $\Xi$ function follows from Stirling's formula for the Gamma function. The dominant term is $$\left(\frac{1}{4}+\frac{it}{2}\right)^{\frac{1}{4}+\frac{it}{2}}=\exp\left(\left(\frac{1}{4}+\frac{it}{2}\right)\log\left(\frac{1}{4}+\frac{it}{2}\right)\right),$$ and by the definition of the complex logarithm this equals $$\exp\left(\left(\frac{1}{4}+\frac{it}{2}\right)\left(\log\left|\frac{1}{4}+\frac{it}{2}\right|+i\arg\left(\frac{1}{4}+\frac{it}{2}\right)\right)\right).$$ Now $\arg\left(\frac{1}{4}+\frac{it}{2}\right)=\frac{\pi}{2}+O\left(\frac{1}{t}\right)$, so the real part of the exponent contains the term $$-\frac{t}{2}\arg\left(\frac{1}{4}+\frac{it}{2}\right)=-\frac{\pi t}{4}+O(1),$$ which produces the factor $e^{-\frac{1}{4}\pi t}$. Combining this with the remaining, polynomially bounded factors, we get $$\Gamma\left(\frac{1}{4}+\frac{it}{2}\right)=O\left(t^{A}e^{-\frac{1}{4}\pi t}\right)$$ for some constant $A$.
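As a numerical sanity check (a sketch, not part of the proof), we can compute $|\Gamma(\frac{1}{4}+\frac{it}{2})|$ and verify that multiplying by $e^{\pi t/4}$ leaves only a slowly varying factor, and that $\arg(\frac{1}{4}+\frac{it}{2})$ approaches $\frac{\pi}{2}$. The helper `lanczos_gamma` below is an illustrative stdlib-only implementation of the standard Lanczos approximation; any complex Gamma routine would do.

```python
import cmath
import math

# Lanczos approximation to the Gamma function (g = 7, 9 coefficients);
# accurate to roughly 13 significant digits.
_P = [0.99999999999980993, 676.5203681218851, -1259.1392167224028,
      771.32342877765313, -176.61502916214059, 12.507343278686905,
      -0.13857109526572012, 9.9843695780195716e-6, 1.5056327351493116e-7]

def lanczos_gamma(z: complex) -> complex:
    if z.real < 0.5:  # reflection formula for the left half-plane
        return math.pi / (cmath.sin(math.pi * z) * lanczos_gamma(1 - z))
    z -= 1
    x = _P[0]
    for i in range(1, len(_P)):
        x += _P[i] / (z + i)
    t = z + 7.5  # z + g + 1/2
    return cmath.sqrt(2 * math.pi) * t ** (z + 0.5) * cmath.exp(-t) * x

for t in (10.0, 20.0, 40.0):
    z = 0.25 + 0.5j * t
    # arg(1/4 + it/2) = pi/2 - arctan(1/(2t)) = pi/2 + O(1/t)
    print(f"t={t:5.1f}  pi/2 - arg(z) = {math.pi/2 - cmath.phase(z):.4f}")
    # |Gamma(1/4 + it/2)| * e^{pi t/4} stays bounded (in fact decays slowly)
    ratio = abs(lanczos_gamma(z)) * math.exp(math.pi * t / 4)
    print(f"         |Gamma(z)| * e^(pi t/4) = {ratio:.4f}")
```

Even though $|\Gamma(\frac{1}{4}+\frac{it}{2})|$ itself drops below $10^{-13}$ by $t=40$, the compensated ratio stays of order one, consistent with the stated $O(t^A e^{-\frac{1}{4}\pi t})$ bound.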

As for differentiation under the integral sign, this works because our function is infinitely differentiable and decays rapidly. Theorem 2.27(b) of Folland's Real Analysis states:

Theorem: Suppose that $f:X\times [a,b]\rightarrow\mathbb{C}$ and that $f(\cdot,t):X\rightarrow \mathbb{C}$ is integrable for each $t\in[a,b]$. Let $F(t)=\int_X f(x,t)\,d\mu(x)$. Suppose that $\partial f/\partial t$ exists and that there is a real $g\in L^1(\mu)$ such that $|(\partial f/\partial t)(x,t)|\leq g(x)$ for all $x,t$. Then $F$ is differentiable and $F'(t)=\int_X (\partial f/\partial t)(x,t)\,d\mu(x)$.

Applying this repeatedly: the $n$-th $\alpha$-derivative of the integrand is $\frac{\Xi(t)}{t^2+\frac{1}{4}}\,t^n\cosh(\alpha t)$ (with $\sinh$ for odd $n$), and for $\alpha \leq \alpha_0 < \frac{1}{4}\pi$ this is dominated by $C\,t^{A+n}e^{-(\frac{\pi}{4}-\alpha_0)t}$, which is integrable on $(0,\infty)$. So the decay condition is enough to differentiate as many times as we like. Alternatively, we may use the theorem appearing on the Wikipedia page:

Theorem: Let $f(x,t)$ be a function such that both $f(x,t)$ and its partial derivative $f_x(x,t)$ are continuous in $t$ and $x$ in some region containing $a(x)\leq t\leq b(x)$, $x_0\leq x\leq x_1$. Also suppose that the functions $a(x)$ and $b(x)$ are both continuous and both have continuous derivatives for $x_0\leq x\leq x_1$. Then for $x_0\leq x\leq x_1$: $$\frac{\mathrm{d}}{\mathrm{d}x} \left (\int_{a(x)}^{b(x)}f(x,t)\,\mathrm{d}t \right) = f(x,b(x))\cdot b'(x) - f(x,a(x))\cdot a'(x) + \int_{a(x)}^{b(x)} f_x(x,t)\; \mathrm{d}t.$$

In our case the limits of integration are the constants $0$ and $\infty$, so the boundary terms vanish; the integrand is infinitely differentiable, and the decay condition supplies the uniform integrability needed at $\infty$, so we may simply take the derivative under the integral sign.
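The mechanism can be illustrated on a toy integral of the same shape (my example, not Titchmarsh's): $F(\alpha)=\int_0^\infty e^{-t}\cosh(\alpha t)\,dt=\frac{1}{1-\alpha^2}$ for $|\alpha|<1$, where the decay $e^{-t}$ plays the role of $e^{-\frac{1}{4}\pi t}$. Differentiating under the sign predicts $F'(\alpha)=\int_0^\infty t\,e^{-t}\sinh(\alpha t)\,dt=\frac{2\alpha}{(1-\alpha^2)^2}$, which a simple Simpson rule confirms:

```python
import math

def simpson(f, a, b, n=20000):
    """Composite Simpson rule on [a, b] with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

alpha = 0.5
T = 60.0  # truncation point; the tail beyond T is ~e^{-(1-alpha)T}, negligible

# F(alpha) and the integral obtained by differentiating under the sign
F  = simpson(lambda t: math.exp(-t) * math.cosh(alpha * t), 0.0, T)
dF = simpson(lambda t: t * math.exp(-t) * math.sinh(alpha * t), 0.0, T)

print(F,  1 / (1 - alpha**2))              # both ~ 1.3333
print(dF, 2 * alpha / (1 - alpha**2)**2)   # both ~ 1.7778
```

The exponential decay beats the growth of $\cosh(\alpha t)$ precisely because $\alpha<1$, just as $\cosh(\alpha t)$ is beaten by $e^{-\frac{1}{4}\pi t}$ when $\alpha<\frac{1}{4}\pi$.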