Let's start with a slightly trivial Markov chain defined as follows: the initial state is $1$ and the state space is $\mathbb{N}$. At each step, when the current state is $n$, the state changes to $n+1$ with probability $f(n)$, for some function $f: \mathbb{N} \to [0,1]$ with $f(1) = 1$; otherwise the state remains unchanged.
To spell this out: the initial state is $1$. At the first step, the state changes to $2$ (with probability $f(1)=1$, so it always does). At the second step, the state changes to $3$ with probability $f(2)$ and otherwise stays at $2$.
Call this first chain "chain 1". If one wants to know the expected value of the state after $k$ steps, there seems to be a trick:
Let $T_{n,n+1}$ be the random variable for the time to go from state $n$ to $n+1$.
Then $\mathbb{E}(T_{n,n+1}) = 1/f(n)$, since $T_{n,n+1}$ is geometrically distributed with success probability $f(n)$.
Next, let $t_k = \mathbb{E} \big( \sum_{i=1}^{k-1} T_{i,i+1} \big) = \sum_{i=1}^{k-1} \mathbb{E}(T_{i,i+1}) = \sum_{i=1}^{k-1} 1/f(i)$.
Then $\mathbb{E}(X_{t_k}) = k$, where $X_t$ denotes the state after $t$ steps.
So it suffices to invert the function $k \mapsto t_k$ to get a good idea of the answer.
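As a quick sanity check of this heuristic (not part of the question), one can simulate chain 1 and compare the average state after $t_k$ steps with $k$. The particular choice $f(n)=1/n$ for $n \geq 2$ is mine, purely for illustration; a sketch:

```python
import random

def f(n):                     # example rate: f(1) = 1, f(n) = 1/n afterwards
    return 1.0 if n == 1 else 1.0 / n

def t(k):                     # t_k = sum_{i=1}^{k-1} E[T_{i,i+1}] = sum 1/f(i)
    return sum(1.0 / f(i) for i in range(1, k))

def simulate(steps, rng):
    """Run chain 1 for the given number of steps, starting from state 1."""
    n = 1
    for _ in range(steps):
        if rng.random() < f(n):
            n += 1
    return n

rng = random.Random(0)
k = 20
steps = round(t(k))           # run the chain for t_k steps (t_20 = 190 here)
avg = sum(simulate(steps, rng) for _ in range(2000)) / 2000
print(k, avg)                 # the heuristic predicts an average near k
```

In this example the empirical average lands close to $k$, which is consistent with the heuristic being a good approximation, though of course not a proof.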
$\mathbf{Sub-Question:}$ Is there a name for the above trick, or a reference where this is done rigorously?
Now look at this other chain, called "chain 2". The initial state is $1$. Then, at each step, when the state is at $n$,
the probability the state changes to $n+1$ is $w(n)$ for some function $w: \mathbb{N} \to [0,1]$ with $w(1) =1$;
the probability the state changes to $n-1$ is $l(n)$ for some function $l: \mathbb{N} \to [0,1]$ with $l(n) < w(n)$ and $l(n)+w(n) \leq 1$;
otherwise the state remains unchanged.
$\mathbf{Question:}$ Is the expected value of the state in "chain 2" (for $w$ and $l$ fixed) the same as in "chain 1" with $f(n) = w(n) - l(n)$?
No name, since the "trick" does not exist as stated: a proper formulation would first require a definition of $E(X_{t_k})$ when $t_k$ is not an integer.
No: for example, $E(X_3)$ already differs between chain 1 and chain 2 when $w(n)=\frac12(1+f(n))$ and $l(n)=\frac12(1-f(n))$ for $1\leqslant n\leqslant 3$, for generic choices of $f(2)$ and $f(3)$.
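This counterexample can be checked by exact computation. The sketch below propagates the full distribution of each chain for three steps using rational arithmetic; the specific values $f(2)=f(3)=\frac12$ are my choice, not from the answer:

```python
from fractions import Fraction

def step(dist, up, down):
    """One step of a birth-death chain: from state n, move to n+1 with
    probability up(n), to n-1 with probability down(n), else stay."""
    out = {}
    for n, p in dist.items():
        u, d = up(n), down(n)
        for m, q in ((n + 1, u), (n - 1, d), (n, 1 - u - d)):
            if q:
                out[m] = out.get(m, Fraction(0)) + p * q
    return out

def expected_state(up, down, steps):
    dist = {1: Fraction(1)}          # both chains start in state 1
    for _ in range(steps):
        dist = step(dist, up, down)
    return sum(n * p for n, p in dist.items())

f = lambda n: Fraction(1) if n == 1 else Fraction(1, 2)   # f(1)=1, f(n)=1/2

# chain 1: up with probability f(n), never down
e1 = expected_state(f, lambda n: Fraction(0), 3)
# chain 2: w(n) = (1+f(n))/2, l(n) = (1-f(n))/2, so that w - l = f
e2 = expected_state(lambda n: (1 + f(n)) / 2, lambda n: (1 - f(n)) / 2, 3)
print(e1, e2)   # 3 versus 25/8: the two expectations differ
```

With this choice chain 1 gives $E(X_3)=3$ while chain 2 gives $E(X_3)=\frac{25}{8}$, confirming that matching $f = w - l$ does not match the expectations.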