Differentiating integrals with respect to variables in the integrator


Consider the function

$$ G(s) = \int g(t) \, \mu(\mathrm dt,s). $$

Here, $g$ is (say) a measurable function from $[0,1]$ to itself, and for each $s$, $\mu(\cdot,s)$ is a finite measure on $[0,1]$. I'm willing to assume more regularity conditions where necessary. (I suppose some assumptions about $\mu(A,\cdot)$ for each measurable $A$ are needed, but it's not clear to me what those are.)

What is $G^\prime (s)?$ When is there a simple formula beyond taking the definition of the derivative? References would be appreciated.


If it were the integrand, rather than the integrator, that depended on $s$, the answer would be standard. That is, if $$G(s) = \int g(t,s) \, \mathrm d \mu(t),$$ then, under suitable regularity conditions, we could differentiate under the integral sign to find

$$ G^\prime(s) = \int \frac{\partial g}{\partial s}(t,s) \,\mathrm d \mu(t). $$
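(As a numerical sanity check of this standard case — with the purely illustrative choices $g(t,s) = e^{-st}$ and $\mu$ equal to Lebesgue measure on $[0,1]$, neither of which comes from the problem itself — one can compare a finite-difference quotient of $G$ against the integral of $\partial g/\partial s$:)

```python
# Numerical check of differentiation under the integral sign, using the
# hypothetical choices g(t, s) = exp(-s*t) and mu = Lebesgue measure on [0, 1].
import numpy as np

def trapezoid(y, x):
    # composite trapezoid rule (avoids version-specific numpy helpers)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

t = np.linspace(0.0, 1.0, 10_001)  # integration grid on [0, 1]
s, h = 2.0, 1e-5

def G(s):
    # G(s) = integral of g(t, s) dt over [0, 1]
    return trapezoid(np.exp(-s * t), t)

finite_diff = (G(s + h) - G(s - h)) / (2 * h)   # derivative by its definition
under_sign = trapezoid(-t * np.exp(-s * t), t)  # integral of dg/ds (t, s) dt

print(abs(finite_diff - under_sign))  # small: the two computations agree
```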

However, in the case I present above, it's not clear how to simplify the derivative beyond the definition.

One case where we might be able to apply the standard toolkit is if there were some reference measure $\nu$, not varying with $s$, with respect to which each $\mu(\cdot,s)$ has a Radon-Nikodym derivative $\frac{\mathrm d \mu}{\mathrm d \nu}(\cdot,s)$. In this case, we could calculate:

$$ G^\prime (s) = \int g(t) \frac{\partial}{\partial s} \frac{\mathrm d \mu}{\mathrm d \nu}(t,s) \,\mathrm d \nu(t). $$
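(For a concrete illustration — with the hypothetical choices of $\nu$ equal to Lebesgue measure on $[0,1]$ and $\frac{\mathrm d \mu}{\mathrm d \nu}(t,s) = e^{st}$, which are not part of the original problem — this reads

$$ G(s) = \int_0^1 g(t)\, e^{st} \,\mathrm d t, \qquad G^\prime(s) = \int_0^1 g(t)\, t e^{st} \,\mathrm d t, $$

which for, say, $g \equiv 1$ recovers the elementary computation $\frac{\mathrm d}{\mathrm d s} \frac{e^s - 1}{s} = \frac{s e^s - e^s + 1}{s^2}$.)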

An example of this might arise in probability theory. Suppose $X_s$ is a random variable whose distribution $F(\cdot,s)$ admits a density $f(\cdot,s)$ that is differentiable in $s$. Then, we could write

$$ \mathbb E\left[g\left(X_s\right)\right] = \int g(t)f(t,s) \,\mathrm d t $$

so that

$$ \frac{\mathrm d}{\mathrm d s}\mathbb E[g(X_s)] = \int g(t) \frac{\partial f}{\partial s} (t,s) \, \mathrm d t. $$
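(This density formula can also be checked numerically. Below I use the hypothetical family $f(t,s) = (s+1)t^s$ on $[0,1]$ with $g(t) = t$ — so $\mathbb E[g(X_s)] = \frac{s+1}{s+2}$ in closed form — none of which is specified by the question itself:)

```python
# Numerical check of the density formula, with the hypothetical family
# f(t, s) = (s + 1) * t**s on [0, 1] (s > 0) and g(t) = t, so that
# E[g(X_s)] = (s + 1) / (s + 2) in closed form.
import numpy as np

def trapezoid(y, x):
    # composite trapezoid rule
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

t = np.linspace(1e-9, 1.0, 10_001)  # avoid t = 0, where log(t) blows up
s, h = 1.0, 1e-5

def expectation(s):
    # E[g(X_s)] = integral of g(t) * f(t, s) dt
    return trapezoid(t * (s + 1) * t**s, t)

# df/ds (t, s) = t**s + (s + 1) * t**s * log(t)
df_ds = t**s + (s + 1) * t**s * np.log(t)
under_sign = trapezoid(t * df_ds, t)

finite_diff = (expectation(s + h) - expectation(s - h)) / (2 * h)
exact = 1.0 / (s + 2.0) ** 2  # d/ds of (s + 1)/(s + 2)

print(abs(finite_diff - under_sign), abs(under_sign - exact))  # both small
```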

Is there something to be said about cases more general than this?