Tauberian theorem


Is there a Tauberian theorem that gives the lower tail of a random variable $X$ whose Laplace transform satisfies $$\mathcal{L}_X(s) \sim 1 - \exp(-As^{-b}) \quad \text{as } s \to \infty,$$ where $A$ is a positive constant and $b \in (0,1)$?

That is, given the above expression, how do we calculate $F_X(\epsilon)$ as $\epsilon \to 0$, where $F_X(\cdot)$ is the cumulative distribution function of $X$?

Best Answer

Since $1 - e^{-u} \sim u$ as $u \to 0^+$, the assumption is equivalent to saying that

$$ \mathcal{L}_X(s) \sim \frac{A}{s^b} \qquad \text{as } s \to \infty.$$

Now, assuming that $X$ is a non-negative random variable, Karamata's proof of the Hardy-Littlewood Tauberian theorem is easily adapted to yield a proof of the following proposition:

Proposition. Let $X$ be a non-negative random variable. If $A > 0$ and $b \geq 0$, then $$\mathcal{L}_X(s) \sim \frac{A}{s^b} \text{ as } s \to \infty \quad \Longleftrightarrow \quad F_X(\epsilon) \sim \frac{A\epsilon^{b}}{\Gamma(b+1)} \text{ as } \epsilon \to 0^+. \tag{*}$$

Proof. If $b = 0$, then $\text{(*)}$ follows immediately from the right-continuity of $F_X$ at $0$ together with

$$ \lim_{s\to\infty} \mathcal{L}_X(s) = \lim_{s\to\infty} \Bbb{E}[ e^{-sX} ] = \Bbb{E}[ \mathbf{1}_{\{X = 0\}} ] = \Bbb{P}(X = 0) = F_X(0). $$

Thus we may assume that $b> 0$.

The direction $(\Leftarrow)$ is straightforward. Assume the right-hand side of $\text{(*)}$. For each $\epsilon \in (0, A)$, there exists $\delta > 0$ such that

$$ \frac{(A-\epsilon) x^b}{\Gamma(b+1)} \leq F_X(x) \leq \frac{(A+\epsilon) x^b}{\Gamma(b+1)} \quad\text{whenever } 0 < x < \delta. $$

Then using the identity $\mathcal{L}_X(s) = s \int_{0}^{\infty} F_X(x) e^{-sx} \, dx$ (which follows from integration by parts together with $F_X(0) = 0$), we have

\begin{align*} \limsup_{s\to\infty} s^b \mathcal{L}_X(s) &\leq \limsup_{s\to\infty} \bigg( \int_{0}^{\delta} s^{b+1} F_X(x) e^{-sx} \, dx + s^b e^{-s\delta} \bigg) \\ &\leq \frac{A+\epsilon}{\Gamma(b+1)} \lim_{s\to\infty} \int_{0}^{\delta} s^{b+1} x^b e^{-sx} \, dx \\ &= \frac{A+\epsilon}{\Gamma(b+1)} \lim_{s\to\infty} \int_{0}^{s\delta} t^b e^{-t} \, dt \qquad (t = sx) \\ &= A+\epsilon. \end{align*}

Since the limsup of $s^b \mathcal{L}_X(s)$ does not depend on $\epsilon$, this gives $\limsup_{s\to\infty} s^b \mathcal{L}_X(s) \leq A$. A similar consideration for the liminf of $s^b \mathcal{L}_X(s)$ proves the direction $(\Leftarrow)$.
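For completeness, the identity $\mathcal{L}_X(s) = s \int_{0}^{\infty} F_X(x) e^{-sx} \, dx$ used above is the one-line integration by parts

$$ \mathcal{L}_X(s) = \int_{0}^{\infty} e^{-sx} \, dF_X(x) = \Big[ e^{-sx} F_X(x) \Big]_{0}^{\infty} + s \int_{0}^{\infty} F_X(x) e^{-sx} \, dx = s \int_{0}^{\infty} F_X(x) e^{-sx} \, dx, $$

where the boundary term vanishes since $F_X(0) = 0$ and $e^{-sx} F_X(x) \to 0$ as $x \to \infty$.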

So let us focus on the proof of $(\Rightarrow)$. To this end, let us assume that the left-hand side of $\text{(*)}$ is true for some $A > 0$ and $b > 0$. This implies that

$$ s^b \int_{0}^{\infty} e^{-sx} \, dF_X(x) = s^b \mathcal{L}_X(s) \xrightarrow[s\to\infty]{} A. $$

Define $\Sigma$ as the family of functions $f : [0, 1] \to \Bbb{R}$ satisfying the following conditions:

C1. There exists a sequence of polynomials $(p_n)_{n\geq 1}$ such that $p_n(x) \leq f(x)$ for all $x \in [0, 1]$ and $p_n(x) \uparrow f(x)$ for a.e. $x \in [0, 1]$.

C2. There exists a sequence of polynomials $(q_n)_{n\geq 1}$ such that $q_n(x) \geq f(x)$ for all $x \in [0, 1]$ and $q_n(x) \downarrow f(x)$ for a.e. $x \in [0, 1]$.

Now let $p$ be any polynomial and write $p(t) = \sum_{k=0}^{n} a_k t^k$ for $n \geq 0$ and $a_0, \cdots, a_n \in \Bbb{R}$. Then

\begin{align*} s^b \int_{0}^{\infty} e^{-sx}p(e^{-sx}) \, dF_X(x) &= \sum_{k=0}^{n} a_k s^b \int_{0}^{\infty} e^{-(k+1)sx} \, dF_X(x) \\ &\xrightarrow[s\to\infty]{} A \sum_{k=0}^{n} \frac{a_k}{(k+1)^b} = \frac{A}{\Gamma(b)} \int_{0}^{\infty} x^{b-1} e^{-x} p(e^{-x}) \, dx. \end{align*}

Thus if $f \in \Sigma$ and $(p_n)$ is a sequence of polynomials satisfying C1, then

\begin{align*} \liminf_{s\to\infty} s^b \int_{0}^{\infty} e^{-sx}f(e^{-sx}) \, dF_X(x) &\geq \lim_{s\to\infty} s^b \int_{0}^{\infty} e^{-sx}p_n(e^{-sx}) \, dF_X(x) \\ &= \frac{A}{\Gamma(b)} \int_{0}^{\infty} x^{b-1} e^{-x} p_n(e^{-x}) \, dx \\ &\xrightarrow[n\to\infty]{\text{DCT}} \frac{A}{\Gamma(b)} \int_{0}^{\infty} x^{b-1} e^{-x} f(e^{-x}) \, dx. \end{align*}

A similar consideration, using a sequence $(q_n)$ as in C2, proves that

$$ \lim_{s\to\infty} s^b \int_{0}^{\infty} e^{-sx}f(e^{-sx}) \, dF_X(x) = \frac{A}{\Gamma(b)} \int_{0}^{\infty} x^{b-1} e^{-x} f(e^{-x}) \, dx. \tag{1}$$
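As a quick numerical sanity check of the identity $\sum_{k=0}^{n} \frac{a_k}{(k+1)^b} = \frac{1}{\Gamma(b)} \int_{0}^{\infty} x^{b-1} e^{-x} p(e^{-x}) \, dx$ behind the polynomial limit above (a sketch with an arbitrarily chosen test polynomial and $b = 1/2$; the factor $A$ cancels on both sides):

```python
import math

# Check: sum_k a_k / (k+1)^b == (1/Gamma(b)) * int_0^inf x^(b-1) e^(-x) p(e^(-x)) dx,
# which rests on int_0^inf x^(b-1) e^(-(k+1)x) dx = Gamma(b) / (k+1)^b.
b = 0.5
a = [2.0, -1.0, 3.0]  # arbitrary test polynomial p(t) = 2 - t + 3 t^2

def p(t):
    return sum(ak * t**k for k, ak in enumerate(a))

lhs = sum(ak / (k + 1)**b for k, ak in enumerate(a))

# Substitute x = u^2 (valid for b = 1/2): the integral becomes
# int_0^inf 2 e^(-u^2) p(e^(-u^2)) du, with no singularity at 0.
n, upper = 200_000, 12.0  # e^(-u^2) is negligible beyond u = 12
h = upper / n
total = 0.5 * 2.0 * p(1.0)  # trapezoid endpoint u = 0; the u = upper term is ~ 0
for i in range(1, n):
    w = math.exp(-(i * h) ** 2)
    total += 2.0 * w * p(w)
rhs = total * h / math.gamma(b)

print(lhs, rhs)  # both approximately 3.0249
```

Both sides agree to within the quadrature error, as expected.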

On the other hand, it is easy to check that

$$ f(x) = \frac{1}{x}\mathbf{1}_{[e^{-1},1]}(x) \in \Sigma. $$

Indeed, $C([0, 1]) \subset \Sigma$ follows from the Stone-Weierstrass theorem. The claim above then follows by approximating $f$ from above and from below by continuous functions.

Then, since $e^{-sx} f(e^{-sx}) = \mathbf{1}_{\{0 \leq x \leq 1/s\}}(x)$, applying $\text{(1)}$ to this $f$ and writing $\epsilon = 1/s$, we have

\begin{align*} \lim_{\epsilon \to 0^+} \epsilon^{-b} F_X(\epsilon) &= \lim_{s \to \infty} s^b \int_{0}^{\infty} e^{-sx} f(e^{-sx}) \, dF_X(x) \\ &= \frac{A}{\Gamma(b)} \int_{0}^{\infty} x^{b-1}e^{-x}f(e^{-x}) \, dx \\ &= \frac{A}{\Gamma(b)} \int_{0}^{1} x^{b-1} \, dx = \frac{A}{\Gamma(b+1)}. \end{align*}

This completes the proof.
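As a concluding sanity check (not part of the proof), the proposition can be verified numerically for the power-law distribution with $F_X(x) = x^b$ on $[0, 1]$: here $F_X(\epsilon) \sim \epsilon^b$, so the proposition predicts $\mathcal{L}_X(s) \sim \Gamma(b+1)/s^b$, i.e. $A = \Gamma(b+1)$. A minimal sketch with $b = 1/2$ (the substitution $x = t^2$ removes the singularity of the density at $0$):

```python
import math

# Power-law example: F_X(x) = x^b on [0, 1], density b * x^(b-1).
# The proposition predicts L_X(s) ~ Gamma(b+1) / s^b as s -> infinity.
b = 0.5
A = math.gamma(b + 1)  # predicted constant A = Gamma(3/2)

def laplace_transform(s, n=200_000):
    """L_X(s) = int_0^1 b x^(b-1) e^(-sx) dx, computed for b = 1/2
    via the substitution x = t^2, which turns the integrand into
    e^(-s t^2) on [0, 1] (no singularity), then the trapezoidal rule."""
    h = 1.0 / n
    total = 0.5 * (1.0 + math.exp(-s))  # endpoint terms t = 0 and t = 1
    for i in range(1, n):
        t = i * h
        total += math.exp(-s * t * t)
    return total * h

s = 100.0
numeric = laplace_transform(s)
asymptotic = A / s**b
print(numeric, asymptotic)  # both approximately 0.08862
```

Already at $s = 100$ the transform and its predicted asymptote agree to several digits.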