I am trying to figure out when for a positive random variable $X_t$ it is ok to write $$ E(\int_a^b X_tdt)=\int_a^b E(X_t)dt $$
I know the theorems about infinite sums, and that it is fine to pull $E$ inside the sum when the random variables are nonnegative, but I am really confused about this one. The only explanation I can give myself is that Riemann sums are used.
Of course this depends on how $X_t$ depends on $t$. For Fubini (or Tonelli) to apply, we need the map $(\omega,t) \mapsto X_t(\omega)$ to be measurable with respect to the product sigma-algebra. Tonelli's theorem then says exactly that for such a jointly measurable, nonnegative process, $$ \mathbb E\left(\int_a^b X_t\,dt\right) = \int_a^b \mathbb E(X_t)\,dt, $$ with both sides possibly infinite. Without joint measurability the identity can fail.
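As a sanity check of the measurable case, here is a small Monte Carlo sketch (my own illustrative example, not part of the argument below): take $X_t(\omega) = t\,Z(\omega)^2$ with $Z$ standard normal, which is jointly measurable and nonnegative, so both iterated integrals over $[0,1]$ should equal $E[Z^2]\int_0^1 t\,dt = 1/2$.

```python
import random

random.seed(0)
N, M = 100_000, 1_000  # samples of omega, grid points in t

# X_t(omega) = t * Z(omega)^2 with Z ~ N(0,1): jointly measurable and
# nonnegative, so Tonelli applies and both orders of integration agree.
Z2 = [random.gauss(0.0, 1.0) ** 2 for _ in range(N)]

# E[ int_0^1 X_t dt ]: for fixed omega the inner integral is exactly
# Z(omega)^2 * int_0^1 t dt = Z(omega)^2 / 2; then average over omega.
lhs = sum(z2 / 2 for z2 in Z2) / N

# int_0^1 E[X_t] dt: E[X_t] = t * E[Z^2], integrated via a Riemann sum.
EZ2 = sum(Z2) / N
rhs = sum((k / M) * EZ2 for k in range(M)) / M

print(lhs, rhs)  # both approximately 0.5
```

Both estimates agree up to Monte Carlo and discretization error, as Tonelli predicts; the counterexample below shows what can happen once joint measurability is dropped.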
Any counterexample to Fubini could be adapted to this situation. Here is a simple one.
Sierpiński showed (assuming the continuum hypothesis) that there is a set $E \subseteq [0,1]\times[0,1]$ such that
for every $t \in [0,1]$, the set $\{\omega \in [0,1] : (\omega,t) \in E\}$ is countable, but
for every $\omega \in [0,1]$, the set $\{t \in [0,1] : (\omega,t) \in E\}$ is cocountable [i.e., its complement is countable].
Taking our probability space to be $[0,1]$ with Lebesgue measure, we define $$ X_t(\omega) = \begin{cases} 1,\qquad & (\omega,t) \in E \\ 0,\qquad & (\omega,t) \notin E \end{cases} $$
Then, for each fixed $\omega$ we have $\int_0^1 X_t(\omega)\,dt = 1$ (the integrand is $1$ except on a countable set) and thus $$ \mathbb E \int_0^1 X_t\;dt = 1 . $$
On the other hand, for each fixed $t$ we have $\mathbb E[X_t] = 0$ (the random variable $X_t$ is zero almost surely) and thus $$ \int_0^1\mathbb E(X_t)\;dt = 0 . $$