Let $(X_n)$ be a Markov chain on the probability space $(\Omega,\mathcal{F}, \mathbb{P})$ with state space $S$ and let $T$ be a stopping time.
I need to calculate $\mathbb{E}_x (T)$, for $x\in S$, but I cannot find a clear definition of this object.
What I've understood so far is that it should be something like $\mathbb{E}[T|\mathcal{F}_0]$ (evaluated at $x$, in some sense), where $\{\mathcal{F}_n\}$ is the natural filtration of $(X_n)$, but I am pretty confused about the domains of definition and the measurability of the functions appearing here.
Can anyone give me a clear definition of this object?
It's often better to think of a Markov chain without any particular initial distribution, in which case your chain will simply be defined by its kernel $\{P(x,\cdot)\}_{x\in S}$ (or $\{P_{n,n+1}(x,\cdot)\}_{x\in S,n\in\mathbb N}$ in the inhomogeneous case). If $\mathbb P^x$ is the probability measure corresponding to starting the chain at $x\in S$, then
$$\mathbb P^\mu(\cdot):=\int\mathbb P^x(\cdot)\mu(dx)$$
is the measure corresponding to the chain with initial distribution $\mu$. If your space $S$ is sufficiently nice, e.g. if it is a Polish space, then it is also possible to go the other way: starting from the law of the chain under some particular initial distribution, one can recover the family $\{\mathbb P^x\}_{x\in S}$. However, that direction needs more advanced machinery, such as regular conditional probabilities or disintegration of measures, so the kernel-first approach is easier.
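When $S$ is finite the integral above is just a finite sum, $\mathbb P^\mu(X_1=y)=\sum_{x\in S}\mu(x)P(x,y)$, and you can check it directly. Here is a minimal sketch with a toy $3$-state kernel (the matrix `P` and distribution `mu` are made up for illustration, not taken from the question):

```python
# Kernel P(x, .): row x is the law of X_1 under P^x for a toy 3-state chain.
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.4, 0.6],
]
mu = [0.1, 0.6, 0.3]  # an initial distribution on S = {0, 1, 2}

# Law of X_1 under P^mu, computed as the mixture integral (a finite sum here):
# P^mu(X_1 = y) = sum_x mu(x) * P(x, y)
law_X1 = [sum(mu[x] * P[x][y] for x in range(3)) for y in range(3)]
```

The resulting `law_X1` is again a probability vector, which mirrors the fact that $\mathbb P^\mu$ is a probability measure whenever each $\mathbb P^x$ is.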
Now, the definition of $\mathbb E^x[T]$ is simply the expectation of $T$ under the probability measure $\mathbb P^x$:
$$\mathbb E^x[T]:=\int_\Omega T\,d\mathbb P^x.$$
Since $T$ is a stopping time it is nonnegative and measurable, so this integral is always well defined, though it may be infinite.
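To make this concrete, here is a hedged Monte Carlo sketch (not part of the question) estimating $\mathbb E^x[T]$ for the symmetric simple random walk on $\{0,\dots,N\}$ with $T$ the hitting time of $\{0,N\}$; the classical gambler's ruin formula gives $\mathbb E^x[T]=x(N-x)$, so we can sanity-check the estimate:

```python
import random

def hitting_time(x, N, rng):
    """One sample of T under P^x: steps until the symmetric walk
    started at x hits 0 or N."""
    t = 0
    while 0 < x < N:
        x += rng.choice((-1, 1))
        t += 1
    return t

def estimate_expected_T(x, N, n_samples=100_000, seed=0):
    """Monte Carlo estimate of E^x[T]: average T over independent
    runs of the chain, all started at x (i.e. sampled under P^x)."""
    rng = random.Random(seed)
    return sum(hitting_time(x, N, rng) for _ in range(n_samples)) / n_samples
```

For $x=2$, $N=4$ the exact value is $x(N-x)=4$, and the estimate should be close to that. The point is only that "expectation under $\mathbb P^x$" operationally means "average over runs of the chain started at $x$".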