Is the expectation of the derivative of the distribution function always 0?


I was wondering whether the expectation of the derivative of the pdf of a random variable, taken with respect to that same random variable, is always 0.
I came to this conjecture by accident, and then tested it using pdfs composed of different polynomials, e.g. the normalized version of:

$$ \begin{cases} -4x(x-2), & 0 \le x \le 1 \\ x^3 - 6x^2 + 9x, & 1 \le x \le 3 \end{cases} $$

These two polynomials meet at $x=1$, where both have derivative $0$, and the expectation of the derivative of the resulting pdf over $[0,3]$ is zero.
I want to know whether there is a general theorem about the expectation of the derivative of a density function that explains this.
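(As a numerical sanity check of the example above, here is a minimal sketch, assuming the support $[0,3]$ and a simple composite trapezoid rule:)

```python
# Check E[f'(X)] = 0 for the piecewise pdf above, normalized on [0, 3].

def g(x):
    """Unnormalized density: -4x(x-2) on [0,1], x^3 - 6x^2 + 9x on [1,3]."""
    return -4 * x * (x - 2) if x <= 1 else x**3 - 6 * x**2 + 9 * x

def g_prime(x):
    """Derivative of g; both pieces have slope 0 at x = 1."""
    return -8 * x + 8 if x <= 1 else 3 * x**2 - 12 * x + 9

def trapezoid(func, a, b, n):
    """Composite trapezoid rule for func on [a, b] with n subintervals."""
    h = (b - a) / n
    s = (func(a) + func(b)) / 2 + sum(func(a + i * h) for i in range(1, n))
    return s * h

n = 300_000                                  # puts a grid point at the kink x = 1
c = 1 / trapezoid(g, 0.0, 3.0, n)            # normalizing constant, 3/20 = 0.15
expectation = trapezoid(lambda x: (c * g_prime(x)) * (c * g(x)), 0.0, 3.0, n)

print(c, expectation)   # c ≈ 0.15, expectation ≈ 0
```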


What you appear to be doing is finding the expectation of $\ f'\hspace{-0.2em}\left(X\right)\ $, where $\ f\ $ is the probability density function of the random variable $\ X\ $. In the general case (assuming $\ f\ $ is continuous and piecewise differentiable) this will be:
\begin{eqnarray}
E\left(f'\hspace{-0.2em}\left(X\right)\right) &=& \int_{-\infty}^\infty f'\hspace{-0.2em}\left(x\right)f\left(x\right)\,dx\\
&=& \int_{-\infty}^\infty \frac{1}{2}\frac{d}{dx}f\left(x\right)^2\,dx\\
&=& \frac{1}{2}\left(\lim_{x\rightarrow\infty}f\left(x\right)^2-\lim_{x\rightarrow-\infty}f\left(x\right)^2\right)\\
&=& 0\ ,
\end{eqnarray}
since an integrable density whose limits at $\pm\infty$ exist can only have those limits equal to $0$. So in this case your conjecture is true.
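(The same cancellation can be checked for a familiar smooth density. A sketch with the standard normal, where $f'(x) = -xf(x)$, so the integrand $-x f(x)^2$ is odd:)

```python
import math

def f(x):
    """Standard normal pdf."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def f_prime(x):
    return -x * f(x)   # so E[f'(X)] = -∫ x f(x)^2 dx, an odd integrand

# Trapezoid rule on [-10, 10]; the tails beyond are negligible.
a, b, n = -10.0, 10.0, 200_000
h = (b - a) / n
s = (f_prime(a) * f(a) + f_prime(b) * f(b)) / 2
s += sum(f_prime(a + i * h) * f(a + i * h) for i in range(1, n))
normal_expectation = s * h

print(normal_expectation)   # ≈ 0, as the general argument predicts
```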

This won't work if $\ f\ $ is discontinuous, however (unless you take the "derivative" of $\ f\ $ at its discontinuities to be an appropriate multiple of the Dirac $\delta$-function). If
$$ f\left(x\right) = \begin{cases} 0 & x<0\\ 2x & 0\le x \le 1\\ 0 & 1<x \end{cases} $$
for instance, then
$$ E\left(f'\hspace{-0.2em}\left(X\right)\right) = \int_0^1 4x\,dx = 2\ . $$
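(This counterexample is easy to confirm numerically too; a sketch that ignores the $\delta$ contribution at $x=1$ and uses the classical derivative $f' = 2$ on the interior of the support:)

```python
def trapezoid(func, a, b, n):
    """Composite trapezoid rule for func on [a, b] with n subintervals."""
    h = (b - a) / n
    s = (func(a) + func(b)) / 2 + sum(func(a + i * h) for i in range(1, n))
    return s * h

f = lambda x: 2 * x        # density on [0, 1]
f_prime = lambda x: 2.0    # classical derivative on (0, 1); delta at 1 ignored

counterexample = trapezoid(lambda x: f_prime(x) * f(x), 0.0, 1.0, 100_000)
print(counterexample)      # ≈ 2, not 0
```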