Probability density function satisfying $\lim_{x\to \infty} xf(x) = \lim_{x\to -\infty} xf(x) $


Let $X$ be a real-valued random variable with probability density function $f(x)$ (i.e., $f(x) \geq 0$ for all $x \in \mathbb{R}$ and $\int_\mathbb{R} f(x)\,dx = 1$), which is also assumed to be twice continuously differentiable on its support. Is the following condition restrictive? $$ \lim_{x\to \infty} xf(x) - \lim_{x\to -\infty} xf(x) = 0 $$ Is it trivial, in the sense that it holds for almost any distribution, or are there many distributions that violate it? What does it imply about the shape of the density function?

I appreciate any help and guidance!


Background: I'm trying to formulate the first-order condition for an extremum estimator of a model in which $f(x)$ is the pdf of the error term. I find that $\mathbb{E}[\psi(X)X] = -1$ is a necessary identification condition (i.e., my estimation function achieves an extremum at the true parameter value), where $\psi(x) = f'(x)/f(x)$ is the density score. Since $\psi(x)f(x) = f'(x)$, integration by parts gives $$\mathbb{E}[\psi(X)X] = \int_{-\infty}^{\infty} f'(x)\,x\, dx = xf(x)\big|_{-\infty}^{\infty} - \int_{-\infty}^{\infty} f(x)\,dx = xf(x)\big|_{-\infty}^{\infty} - 1, $$ so the identification condition requires $\lim_{x\to \infty} xf(x) = \lim_{x\to -\infty} xf(x)$. But I'm not able to find more primitive conditions on the distribution under which this equation holds. A colleague told me that either $f(x)$ must have support $\mathbb{R}$ or it must be symmetric about the origin. But I don't think this is true, because, for example, if $X$ follows a continuous uniform distribution on any interval $[a,b]$ with $-\infty < a < b < \infty$, the condition is satisfied.
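The identity $\mathbb{E}[\psi(X)X] = -1$ can be checked numerically in a concrete case. The sketch below assumes $X \sim N(0,1)$, for which the score is exactly $\psi(x) = -x$, so $\mathbb{E}[\psi(X)X] = -\mathbb{E}[X^2] = -1$; a simple trapezoidal quadrature recovers this:

```python
import math

# Check E[psi(X) X] = -1 for the standard normal (assumed for this
# illustration). Here psi(x) = f'(x)/f(x) = -x, and x*f(x) -> 0 in
# both tails, so the boundary term in the integration by parts vanishes.

def f(x):
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def psi(x):
    return -x  # density score f'(x)/f(x) of N(0,1)

def expect_psi_x(a=-12.0, b=12.0, n=200_000):
    """Trapezoidal approximation of E[psi(X) X] = int psi(x) x f(x) dx."""
    h = (b - a) / n
    total = 0.5 * (psi(a) * a * f(a) + psi(b) * b * f(b))
    for i in range(1, n):
        x = a + i * h
        total += psi(x) * x * f(x)
    return total * h

print(expect_psi_x())  # close to -1
```

Truncating the integral at $\pm 12$ is harmless here because $x^2 f(x)$ is vanishingly small beyond that range; for heavier-tailed densities the truncation point would need care.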