Conditions for differentiating under integral sign with mixed finite and infinite limits


I am working through Problem 2.18 of Statistical Inference by Casella & Berger, which asks us to show that, for a continuous random variable $X$ with median $m$,

$$\min_a \textrm{E}|X-a|=\textrm{E}|X-m|$$
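As a quick numerical sanity check of this claim (not part of a proof), here is a sketch that computes $\textrm{E}|X-a|$ by numerical integration for an exponential distribution with rate 1 (density $e^{-x}$ on $[0,\infty)$, median $\ln 2$) and confirms that the minimizer over a grid lands near the median:

```python
import numpy as np
from scipy.integrate import quad

def expected_abs_dev(a, pdf=lambda x: np.exp(-x)):
    """E|X - a| for a nonnegative X, split at a as in the calculus approach."""
    left, _ = quad(lambda x: (a - x) * pdf(x), 0.0, a)      # x <= a piece
    right, _ = quad(lambda x: (x - a) * pdf(x), a, np.inf)  # x >= a piece
    return left + right

grid = np.linspace(0.01, 3.0, 300)
vals = [expected_abs_dev(a) for a in grid]
a_star = grid[int(np.argmin(vals))]
print(a_star, np.log(2))  # grid minimizer should sit near the median ln 2 ≈ 0.693
```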

I am taking a calculus approach: let

$$g(a)=\textrm{E}|X-a|=\int_{-\infty}^a (a-x)f_X(x)\,dx+\int_a^\infty (x-a)f_X(x)\,dx$$ (writing $g$ rather than $f$ to avoid a clash with the density $f_X$). If we could differentiate under the integral sign, then finding the minimum by solving $g'(a)=0$ would be easy. Where I am unsure is how to justify differentiating under the integral sign, e.g. for the first integral:

$$\frac{d}{da}\int_{-\infty}^a (a-x)f_X(x)\,dx=\int_{-\infty}^a \frac{\partial}{\partial a}\big[(a-x)f_X(x)\big]\,dx$$ In fact, I am not even sure this is true, since the upper limit depends on the differentiation variable $a$ (and by Leibniz's Rule with both limits finite functions of $a$, moving the derivative under the integral sign introduces additional boundary terms on the RHS). The book states without proof an equation for this derivative when both limits are finite functions of $a$ (Leibniz's Rule), and a separate equation with conditions (the existence of a dominating function $g$ that bounds the partial derivative of the integrand and is itself well behaved) for fixed limits, but nothing about what happens when the limits are mixed, one finite and one infinite. Does such a result exist, and if so, how can I proceed rigorously with evaluating this derivative?
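Not an answer, but a numerical experiment I found reassuring: for a standard normal density the interchange above appears to hold, which is consistent with the extra Leibniz boundary term vanishing here, since the integrand $(a-x)f_X(x)$ is zero at $x=a$. A central finite difference of the first integral matches $\int_{-\infty}^a f_X(x)\,dx=\Phi(a)$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def first_integral(a):
    """int_{-inf}^a (a - x) * phi(x) dx, with phi the N(0,1) density."""
    val, _ = quad(lambda x: (a - x) * norm.pdf(x), -np.inf, a)
    return val

a, h = 0.5, 1e-4
fd = (first_integral(a + h) - first_integral(a - h)) / (2 * h)  # numerical dI/da
claim = norm.cdf(a)  # int_{-inf}^a phi(x) dx, the interchanged form
print(fd, claim)     # the two values should agree closely
```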