My knowledge of measure theory and probability spaces is limited, so please keep it relatively simple.
Let $\{X(t), ~ t \ge 0\}$ be a Markov process on the countable state space $\mathbb{N}_0$ with initial condition $X(0) = 0$. Let $T_j$ be the first time the Markov process hits state $j$.
Say I want to compute the expected time the process spends in state $i$ before it first reaches state $j$, i.e.,
\begin{equation} \mathbb{E}\left[ \int_{t = 0}^\infty 1\{X(t) = i, T_j > t\} \,\mathrm{d}t \right], \tag 1 \end{equation}
where $1\{\cdot\}$ is the indicator function. I would like to write that $(1)$ is equal to
\begin{equation} \int_{t = 0}^\infty \mathbb{P}(X(t) = i, T_j > t) \, \mathrm{d}t. \tag 2 \end{equation}
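To make the quantity in $(1)$ concrete, here is a small Monte Carlo sketch. The chain is an illustrative choice of mine, not part of the question: a birth-death chain on $\mathbb{N}_0$ with birth rate $1$ from every state and death rate $0.5$ from every state $\ge 1$, with $i = 1$ and $j = 4$. For these rates the exact value of $(1)$ works out to $1.75$ (I solved the occupation-time linear system by hand), so the estimate should land near that.

```python
import random

# Assumed illustrative chain (not from the question): birth-death chain on
# {0, 1, 2, ...} with birth rate LAM from every state and death rate MU from
# states >= 1. We estimate the expected time spent in state I before the
# first hit of state J, starting from X(0) = 0.
LAM, MU = 1.0, 0.5
I, J = 1, 4

def time_in_i_before_j(rng):
    """Simulate one sample path from X(0)=0 until T_J; return time spent in I."""
    x, total = 0, 0.0
    while x != J:
        rate = LAM + (MU if x > 0 else 0.0)   # total exit rate from state x
        hold = rng.expovariate(rate)          # exponential holding time
        if x == I:
            total += hold
        # jump up or down proportionally to the jump rates
        x = x + 1 if rng.random() < LAM / rate else x - 1
    return total

rng = random.Random(42)
n = 20_000
est = sum(time_in_i_before_j(rng) for _ in range(n)) / n
print(est)   # should be close to 1.75 for these rates
```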
However, I am unsure what the correct mathematical steps are and what notation I should use. So my question is: is the following correct, and which parts of the notation should be improved?
My attempt is as follows:
\begin{align} \mathbb{E}\left[ \int_{t = 0}^\infty 1\{X(t) = i, T_j > t\} \,\mathrm{d}t \right] &= \int_{\omega \in \Omega} \int_{t = 0}^\infty 1\{X(t,\omega) = i, T_j(\omega) > t\} \,\mathrm{d}t \, \mathrm{d}\mathbb{P}(\omega) \\ &= \int_{t = 0}^\infty \int_{\omega \in \Omega} 1\{X(t,\omega) = i, T_j(\omega) > t\} \, \mathrm{d}\mathbb{P}(\omega)\,\mathrm{d}t \\ &= \int_{t = 0}^\infty \mathbb{P}(X(t) = i, T_j > t) \, \mathrm{d}t, \end{align}
where $\omega$ is an outcome, $\Omega$ is the sample space and $\mathbb{P}$ is the probability measure on $\Omega$; by an outcome I mean a sample path of the Markov process. Tonelli's theorem justifies swapping the order of integration in the second equality, since the integrand is non-negative (and jointly measurable).
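As a numerical check of the identity, $(2)$ can be evaluated for a concrete chain. The chain below is my own illustrative choice (birth rate $1$, death rate $0.5$ from states $\ge 1$, $i = 1$, $j = 4$), not part of the question. I use the standard fact that $\mathbb{P}(X(t) = i, T_j > t)$ is the $(0, i)$ entry of $e^{t\tilde Q}$, where $\tilde Q$ is the generator restricted to states $\{0, \dots, j-1\}$ (the process killed at $j$); integrating over $t \ge 0$ then gives $[(-\tilde Q)^{-1}]_{0,i}$, so the numerical integral and the closed form should agree.

```python
import numpy as np
from scipy.linalg import expm

# Assumed illustrative chain (not from the question): birth rate 1 from every
# state, death rate 0.5 from states >= 1, target state j = 4. Qs is the
# generator restricted to the transient states {0, 1, 2, 3}.
Qs = np.array([
    [-1.0,  1.0,  0.0,  0.0],
    [ 0.5, -1.5,  1.0,  0.0],
    [ 0.0,  0.5, -1.5,  1.0],
    [ 0.0,  0.0,  0.5, -1.5],
])
i = 1

# Numerically integrate P(X(t) = i, T_j > t) = [expm(t * Qs)]_{0, i} over t,
# via one small-step matrix exponential and repeated multiplication.
dt, T = 0.02, 80.0
step = expm(Qs * dt)                  # one-step transition matrix of the killed process
P = np.eye(4)                         # P(0) = identity
integral, prev = 0.0, P[0, i]
for _ in range(int(T / dt)):          # trapezoidal rule in t
    P = P @ step
    integral += 0.5 * (prev + P[0, i]) * dt
    prev = P[0, i]

exact = np.linalg.inv(-Qs)[0, i]      # closed form [(-Qs)^{-1}]_{0, i}
print(integral, exact)                # the two values should agree closely
```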