Inequality for the Laplace transform of a density function


Let $X$ be a positive random variable with density function $f$ and $\mathbb{E}X<\infty$, and let $a>0$ be such that $a\cdot\mathbb{E}X<1$. Denote the Laplace transform of $f$ by $$\hat{f}(s)=\int_0^{\infty}e^{-sx}f(x)\,dx.$$ I don't understand why the following inequality holds: $$a\left(\frac{1}{s}-\frac{\hat{f}(s)}{s}\right)<1\quad\forall s>0.$$
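As a quick numerical sanity check of the claimed inequality, here is a sketch for one concrete distribution: $X\sim\text{Exponential}(\lambda)$, whose Laplace transform has the standard closed form $\hat{f}(s)=\lambda/(\lambda+s)$. The values of `lam`, `a`, and the grid of `s` values are illustrative choices, not from the question.

```python
import math

lam = 2.0            # rate of the exponential, so E[X] = 1/lam = 0.5
EX = 1.0 / lam
a = 1.5              # chosen so that a * E[X] = 0.75 < 1

def fhat(s):
    # closed-form Laplace transform of the Exponential(lam) density
    return lam / (lam + s)

def lhs(s):
    # the left-hand side a * (1/s - fhat(s)/s)
    return a * (1.0 - fhat(s)) / s

# the inequality should hold for every s > 0
for s in [0.01, 0.1, 1.0, 10.0, 100.0]:
    assert lhs(s) < 1.0
```

For this distribution the left-hand side simplifies to $a/(\lambda+s)$, which is at most $a\,\mathbb{E}X<1$, consistent with the general argument in the answer below.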

1 Answer

Integration by parts (using $F(0)=0$, since $X$ is positive) gives $$\hat{f}(s)=\int_0^{\infty}e^{-sx}f(x)\,dx=s\int_0^{\infty}e^{-sx}F(x)\,dx,$$ and since $\int_0^{\infty}e^{-sx}\,dx=\frac{1}{s}$ we get $$\int_0^{\infty}e^{-sx}(1-F(x))\,dx=\frac{1-\hat{f}(s)}{s}.$$ Now $e^{-sx}\leq 1$ for $x\geq 0$, and $\int_0^{\infty}(1-F(x))\,dx=\mathbb{E}X$ (the tail formula for the expectation of a positive random variable), therefore $$a\left(\frac{1-\hat{f}(s)}{s}\right)=a\int_0^{\infty}e^{-sx}(1-F(x))\,dx\leq a\int_0^{\infty}(1-F(x))\,dx=a\,\mathbb{E}X<1.$$
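The key identity above, $\int_0^{\infty}e^{-sx}(1-F(x))\,dx=\frac{1-\hat{f}(s)}{s}$, can be checked numerically. The sketch below again uses $X\sim\text{Exponential}(\lambda)$ (an illustrative choice, for which $1-F(x)=e^{-\lambda x}$) and compares a quadrature of the tail integral against the closed-form right-hand side.

```python
import math
from scipy.integrate import quad

lam = 2.0  # illustrative rate parameter

def fhat(s):
    # closed-form Laplace transform of the Exponential(lam) density
    return lam / (lam + s)

for s in [0.5, 1.0, 5.0]:
    # left-hand side: integral of e^{-sx} * (1 - F(x)) over (0, infinity)
    tail_integral, _ = quad(lambda x: math.exp(-s * x) * math.exp(-lam * x),
                            0, math.inf)
    # right-hand side: (1 - fhat(s)) / s
    assert math.isclose(tail_integral, (1.0 - fhat(s)) / s, rel_tol=1e-9)
```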