I am reading a book titled "An Introduction to Stochastic Differential Equations" written by Evans.
In Chapter 4, it introduces the Paley-Wiener-Zygmund stochastic integral as follows:
Definition. Suppose $g: [0,T] \to \mathbb{R}$ is continuously differentiable, with $g(0) = g(T) = 0$. We define $$ \int_0^T g\,dW := -\int_0^T g'W\,dt. $$
A lemma and its proof follow:
Lemma. We have $E\big( \int_0^T g \,dW \big) = 0$.
Proof. $E\big( \int_0^T g \,dW\big) = -E\big( \int_0^T g'W \,dt\big) = -\int_0^T g'E(W(t)) \,dt = 0.$
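As a numerical sanity check of the lemma, here is a Monte Carlo sketch of the Paley-Wiener-Zygmund integral $-\int_0^T g'W\,dt$ for the illustrative choice $g(t) = \sin(\pi t)$ on $[0,1]$ (so $g(0)=g(1)=0$); the path counts and step sizes are arbitrary, not from the book. The sample mean should be near $0$, and the sample variance near $\int_0^1 g^2\,dt = 1/2$ (the Itô isometry, which also holds for this integral):

```python
# Monte Carlo check of E[∫_0^T g dW] = 0, using the Paley-Wiener-Zygmund
# definition ∫_0^T g dW := -∫_0^T g'(t) W(t) dt.
# g(t) = sin(pi t) on [0,1], so g(0) = g(1) = 0 and g'(t) = pi cos(pi t).
import numpy as np

rng = np.random.default_rng(0)
T, n_steps, n_paths = 1.0, 500, 20_000
t = np.linspace(0.0, T, n_steps + 1)
dt = T / n_steps

# Brownian paths: W(0) = 0, independent N(0, dt) increments.
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)

g_prime = np.pi * np.cos(np.pi * t)
f = g_prime * W                                        # integrand g'(t) W(t)
# Trapezoidal rule in t for each path, then negate per the definition.
pwz = -dt * (0.5 * f[:, 0] + f[:, 1:-1].sum(axis=1) + 0.5 * f[:, -1])

print(f"sample mean of int g dW: {pwz.mean():+.4f}  (theory: 0)")
print(f"sample variance:         {pwz.var():.4f}  (theory: 1/2)")
```

With $20{,}000$ paths the standard error of the mean is about $\sqrt{0.5/20000}\approx 0.005$, so a sample mean of that order is consistent with the lemma.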
Here I don't understand why the second equality holds. It seems that he uses Fubini's theorem. However, in order to do that, $W(t,\omega)$, which is presumably the Wiener process, should be jointly measurable on $[0,T] \times X$, where $X$ is the probability space on which $W(t,\cdot)$ is defined for every $t$.
How can I justify this? Any help would be appreciated!
(Good catch! It's a little surprising that Evans doesn't comment on this point.)
The joint measurability follows because each map $\omega\mapsto W(t,\omega)$ is measurable and $t\mapsto W(t,\omega)$ is continuous for each $\omega$. Using the continuity, you can approximate $W$ as follows: $$ W(t,\omega) =\lim_{n\to\infty}\sum_{k=0}^\infty 1_{[k/n,(k+1)/n)}(t)\cdot W(k/n,\omega). $$ (This sum, for a fixed $t$, has only finitely many non-zero terms, so convergence of the series is not an issue.) Each term in the sum is jointly measurable in $(t,\omega)$, hence so is the sum, hence so is the pointwise limit.
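The convergence in that display can be seen on a single simulated path: freeze $W$ at the left endpoint of each dyadic interval $[k/n,(k+1)/n)$ and watch the sup-norm error shrink as $n$ grows. The grid sizes and seed below are arbitrary choices for the demo, and the decrease for one fixed path is only "almost sure", not guaranteed monotone:

```python
# One-path illustration of W_n(t) = sum_k 1_{[k/n,(k+1)/n)}(t) W(k/n)
# converging to W(t), by continuity of the Brownian path.
import numpy as np

rng = np.random.default_rng(1)
M = 2 ** 12                                  # fine grid on [0, 1]
dt = 1.0 / M
W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), M))])

errs = []
for n in (4, 16, 64, 256):                   # n must divide M here
    step = M // n
    # Piecewise-constant approximation: hold W at the left endpoint k/n.
    W_n = W[(np.arange(M + 1) // step) * step]
    err = np.abs(W - W_n).max()
    errs.append(err)
    print(f"n = {n:3d}:  sup_t |W(t) - W_n(t)| = {err:.3f}")
```

Each approximant $W_n$ is a finite sum of products $1_{[k/n,(k+1)/n)}(t)\cdot W(k/n,\omega)$, i.e. manifestly jointly measurable, which is exactly the point of the argument above.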