Interchanging expectation value and derivative


Let $\{X(t)\}$ be a stochastic process and $\{\mu_t\}$ the family of its laws. I know that the process is bounded by $1$ for every $t$. I would like to prove that

$\frac{d}{dt}\mathbb{E}_{\mu_t}(X(t))=\mathbb{E}_{\mu_t}(\frac{d}{dt}X(t))$.

My idea was to write the derivative as a limit and apply the dominated convergence theorem to exchange the limit and the expectation. Is that correct, in your opinion?
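(One point to be careful about: for this to work, the domination has to hold for the difference quotients, not just for $X$ itself, so the bound $|X(t)|\le 1$ alone is not enough. Writing the derivative as a limit,

$$\frac{d}{dt}\mathbb{E}_{\mu_t}(X(t))=\lim_{h\to 0}\mathbb{E}\left(\frac{X(t+h)-X(t)}{h}\right),$$

dominated convergence applies once there is an integrable $g$ with $\left|\frac{X(t+h)-X(t)}{h}\right|\le g$ for all small $h$.)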

Edit: Thanks to all of you.

Actually, the process is not continuous in time (it is càdlàg). What I know is that it satisfies a stochastic differential equation of the form

$dX(t)= f(X(t))dt+ dM(t)$

where $M(t)$ is a martingale, and $|f(X(t))|<1$ and $|M(t)|<1$ for every $t$.
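Taking expectations in the integrated equation suggests that the identity should really be stated with $f$ in place of a pathwise derivative (which need not exist for a càdlàg process). Integrating,

$$X(t)=X(0)+\int_0^t f(X(s))\,ds+M(t)-M(0),$$

and since a martingale has constant expectation, the martingale part drops out:

$$\mathbb{E}(X(t))=\mathbb{E}(X(0))+\int_0^t\mathbb{E}\big(f(X(s))\big)\,ds,$$

where the exchange of expectation and time integral is justified by Fubini's theorem together with the bound $|f(X(t))|<1$. Differentiating in $t$ then gives $\frac{d}{dt}\mathbb{E}(X(t))=\mathbb{E}\big(f(X(t))\big)$.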

1 Answer

Check out these lecture notes by James Norris entitled "Probability and Measure". In particular, look at Section 3.5, "Differentiation Under the Integral Sign", on page 24. Theorem 3.5.1 may help; it uses exactly the dominated convergence argument you suggest. The section after it (Section 3.6) covers Fubini's theorem, which may also be of use.
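As a quick numerical sanity check (a toy example with assumed ingredients, not taken from the notes): for the linear drift $f(x)=-x$ and Brownian noise, the identity $\frac{d}{dt}\mathbb{E}(X(t))=\mathbb{E}(f(X(t)))$ makes $m(t)=\mathbb{E}(X(t))$ solve $m'=-m$, so a Monte Carlo estimate of $\mathbb{E}(X(T))$ should reproduce $x_0 e^{-T}$.

```python
import math
import random

# Toy check of  d/dt E[X(t)] = E[f(X(t))]  for the (assumed) SDE
#   dX = -X dt + sigma dW,
# simulated with the Euler-Maruyama scheme. Taking expectations kills
# the martingale part, so E[X(t)] solves m' = -m, m(0) = x0.
random.seed(0)

x0, sigma = 1.0, 0.3
T, dt = 1.0, 0.01
n_steps = int(T / dt)
n_paths = 20_000

total = 0.0
for _ in range(n_paths):
    x = x0
    for _ in range(n_steps):
        # drift term f(x) dt plus mean-zero Gaussian increment sigma dW
        x += -x * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
    total += x

mc_mean = total / n_paths          # Monte Carlo estimate of E[X(T)]
exact = x0 * math.exp(-T)          # solution of m' = -m at time T
print(mc_mean, exact)              # should agree up to discretization and MC error
```

Of course this drift does not satisfy $|f|<1$ globally, so it only illustrates the mechanism of the identity, not the exact hypotheses in the question.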