Convergence speed of the tail of distribution using Tauberian remainder theorem


This question is related to an earlier one of mine.

I am now trying to do some statistical inference using the Laplace transform, but I face the following problem.

Let $f$ be a probability density supported on $[0,\infty)$, and let $\hat{f}$ be its Laplace transform.

Now assume that we only have information about $\hat{f}(s)$ near $s=0$ (for example, $\hat{f}(s) = \hat{g}(s) / (2-\hat{g}(s))$ with $\hat{g}(s)$ known), and from this we want to find the decay rate of the tail:

\begin{align*} \int_x^{\infty} f(t)\, dt = O(?). \end{align*}
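To make the setup concrete, here is a minimal numerical sketch under an assumed example (not from the original post): take $\hat{g}(s) = 1/(1+s)$, the transform of the standard exponential density. Then $\hat{g}(s)/(2-\hat{g}(s)) = 1/(1+2s)$, which is the Laplace transform of the $\mathrm{Exp}(1/2)$ density $\tfrac12 e^{-x/2}$, so in this particular case the tail is known explicitly.

```python
import numpy as np
from scipy.integrate import quad

def laplace(f, s):
    """Numerical Laplace transform: int_0^inf f(x) exp(-s x) dx."""
    val, _ = quad(lambda x: f(x) * np.exp(-s * x), 0, np.inf)
    return val

g = lambda x: np.exp(-x)              # Exp(1) density, so g_hat(s) = 1/(1+s)
g_hat = lambda s: laplace(g, s)

# With g_hat(s) = 1/(1+s), the relation f_hat = g_hat / (2 - g_hat)
# simplifies to 1/(1+2s), the transform of the Exp(1/2) density.
for s in [0.1, 0.5, 1.0]:
    lhs = g_hat(s) / (2 - g_hat(s))
    rhs = 1 / (1 + 2 * s)
    print(s, lhs, rhs)
```

Of course, the whole difficulty of the question is that in general we only know $\hat{f}$ near $s=0$ and cannot invert it in closed form.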

According to the Tauberian remainder theory in J. Korevaar's book (Example 2.3, p. 348):

  • If $|\hat{f}(s) - \hat{f}(0)| \le Cs^{\alpha}$ for some $\alpha > 0$, then the tail is $O(1/\log x)$;
  • If $|\hat{f}(s) - \hat{f}(0)| \le Ce^{-\alpha/s}$ for some $\alpha > 0$, then the tail is $O(1/\sqrt{x})$.
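A small check (my own illustration, not from Korevaar) of why these bounds are so pessimistic: the standard exponential density has $\hat{f}(s) = 1/(1+s)$, so $|\hat{f}(s)-\hat{f}(0)| = s/(1+s) \le s$, i.e. the first (Hölder-type) condition holds with $C=\alpha=1$. The theorem then only promises $O(1/\log x)$, even though the actual tail $e^{-x}$ is far lighter, showing the bound is a worst case over the whole class.

```python
import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x)  # Exp(1) density: f_hat(s) = 1/(1+s), tail = e^{-x}

def f_hat(s):
    """Numerical Laplace transform of f."""
    val, _ = quad(lambda x: f(x) * np.exp(-s * x), 0, np.inf)
    return val

# |f_hat(s) - f_hat(0)| = s/(1+s) <= s: the Holder condition holds with
# C = alpha = 1, yet the true tail decays exponentially.
for s in [1e-3, 1e-2, 1e-1]:
    print(s, abs(f_hat(s) - f_hat(0)), s / (1 + s))
```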

In fact, this is a disaster for statisticians: no higher-order Taylor approximation of $\hat{f}(s)$ near $s=0$ can tell whether the tail is light or extremely heavy ($1/\log x$).

So the question is the following.

Question. What additional condition on $\hat{f}$ near $s=0$ guarantees that the tail is at least a power tail, $O(x^{-\beta})$? If no amount of information about $\hat{f}$ near $s=0$ can guarantee a power tail, then what condition on $\hat{g}$ is required?

Any help would be appreciated (suggestions of books, papers, or anything else!).

Thanks for reading.