I encountered the following problem while working on something. In what follows, let $X$ be a non-negative random variable.
If we know $E(X^p) < \infty$, then it is well known that $x^pP(X>x) \to 0$ as $x\to \infty$. In particular, $x^pP(X>x)$ is a bounded function of $x$.
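As a quick numerical sanity check (not part of the question itself), a Pareto-type tail makes the decay explicit: if $P(X>x)=x^{-a}$ for $x\geq 1$ with $a>p$, then $E(X^p)<\infty$ and $x^pP(X>x)=x^{p-a}\to 0$. A short sketch (the function name `tail` is just for illustration):

```python
def tail(x, a):
    """Survival function P(X > x) of a Pareto(a) variable supported on [1, inf)."""
    return min(1.0, x ** (-a))

p, a = 1.0, 2.0  # a > p, so E[X^p] is finite
for x in [10.0, 100.0, 1000.0, 10000.0]:
    # x**p * P(X > x) = x**(p - a) = 1/x here, so the product decays to 0
    print(x, x ** p * tail(x, a))
```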
Now, if instead we know $E(X) = \infty$, are there any known lower bounds on $xP(X>x)$ under some "nice" assumptions (for example, $X$ having a positive density on $[0,\infty)$)?
For example, if $X$ has some mass at $\infty$, say $P(X= \infty) = p > 0$, then $xP(X>x) \geq px$ for all $x \geq 0$.
Let $X:\Omega\rightarrow[0,\infty)$ be a random variable and let $p\in[1,\infty)$. Recall the layer-cake (tail-integral) formula $$ E\left[X^{p}\right]=\int_{0}^{\infty}px^{p-1}P\left(X>x\right)dx. $$ In our case, in particular, $\infty=E\left[X\right]=\int_{0}^{\infty}P\left(X>x\right)dx$. Denote $f(x)=P\left(X>x\right)$. Since $X$ is real-valued, $f(x)\rightarrow0$ as $x\rightarrow\infty$. We consider functions $g:[2,\infty)\rightarrow[0,\infty)$ that are decreasing and satisfy $\int_{2}^{\infty}g(x)dx=\infty$. Such a $g$ cannot decrease too fast; e.g., $g(x)=\frac{1}{x^{2}}$ does not fit our criterion. A typical example is $g_{1}(x)=\frac{1}{x}$, for which $xg_{1}(x)=1$. Another example is $g_{2}(x)=\frac{1}{x\ln x}$, for which $xg_{2}(x)=\frac{1}{\ln x}\rightarrow0$ as $x\rightarrow\infty$, and yet $\int_{2}^{\infty}\frac{1}{x\ln x}dx=\infty$.
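The contrast between $g_1$ and $g_2$ can be checked numerically (illustration only; `partial_integral` is my own name): $\int_{2}^{N}\frac{dx}{x\ln x}=\ln\ln N-\ln\ln 2$ grows without bound as $N\to\infty$, even though $xg_2(x)=\frac{1}{\ln x}\to 0$.

```python
import math

def partial_integral(N):
    """Closed form of the integral of 1/(x*ln x) over [2, N]."""
    return math.log(math.log(N)) - math.log(math.log(2.0))

for N in [1e2, 1e4, 1e8, 1e16]:
    # The partial integrals diverge (slowly), while N*g2(N) = 1/ln N -> 0.
    print(N, partial_integral(N), 1.0 / math.log(N))
```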
Put formally, we claim: there exists a random variable $X:\Omega\rightarrow[0,\infty)$ such that $xP\left(X>x\right)\rightarrow0$ as $x\rightarrow\infty$, but $E\left[X\right]=\infty$.
Proof: Fix $x_{0}=2$ (any $x_{0}$ with $x_{0}\ln x_{0}>1$ works). Define $F:\mathbb{R}\rightarrow[0,1]$ by $$ F(x)=\begin{cases} 1-\frac{1}{x\ln x}, & \mbox{if }x\geq x_{0}\\ 0, & \mbox{if }x<x_{0} \end{cases}. $$ Since $x\mapsto x\ln x$ is increasing on $[x_{0},\infty)$, $F$ is increasing; it is also right-continuous, $F(x)\rightarrow1$ as $x\rightarrow\infty$, and $F(x)\rightarrow0$ as $x\rightarrow-\infty$. Therefore, $F$ is a c.d.f. By the Skorokhod representation theorem, we can construct a probability space $(\Omega,\mathcal{F},P)$ and a random variable $X:\Omega\rightarrow\mathbb{R}$ such that $F(x)=P\left(X\leq x\right)$. Then $X$ is non-negative because $P(X\leq0)=F(0)=0$, and $P(X>x)=1-F(x)=\frac{1}{x\ln x}$ for $x\geq x_{0}$. It follows that $E\left[X\right]=\int_{0}^{\infty}P(X>x)dx\geq\int_{x_{0}}^{\infty}P(X>x)dx=\int_{x_{0}}^{\infty}\frac{1}{x\ln x}dx=\infty$. Moreover, $xP(X>x)=\frac{1}{\ln x}\rightarrow0$ as $x\rightarrow\infty$. $\square$
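The c.d.f. above can also be sampled by inverse transform: solve $1-\frac{1}{x\ln x}=u$, i.e. $x\ln x=\frac{1}{1-u}$, by bisection. A minimal sketch, assuming $x_0=2$ (the name `quantile` and the iteration count are my own choices, not from the original answer); note $F$ has an atom of mass $F(x_0)=1-\frac{1}{x_0\ln x_0}$ at $x_0$:

```python
import math
import random

X0 = 2.0  # x0 = 2 suffices, since 2*ln 2 > 1

def quantile(u):
    """Generalized inverse of F(x) = 1 - 1/(x*ln x) on [X0, inf), for u in [0, 1).
    Solves x*ln x = 1/(1 - u) by bisection on the increasing map x -> x*ln x."""
    target = 1.0 / (1.0 - u)
    if target <= X0 * math.log(X0):
        return X0  # u falls in the atom at X0, which carries mass F(X0)
    lo, hi = X0, X0
    while hi * math.log(hi) < target:  # bracket the root
        hi *= 2.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if mid * math.log(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

random.seed(0)
# The sample mean is unstable as n grows (E[X] = infinity), while the
# tail product x*P(X > x) = 1/ln x still decays to 0.
samples = [quantile(random.random()) for _ in range(100_000)]
print("sample mean:", sum(samples) / len(samples))
for x in [1e2, 1e6, 1e12]:
    print(x, 1.0 / math.log(x))  # x * P(X > x)
```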