I'm having trouble understanding the question below. I understand what a continuous-time Markov chain and a unique stationary distribution are, but I'm not sure what the question is asking for.
I have a continuous-time Markov chain (X(t))_{t≥0} with state space {1, 2} that evolves as follows:
• The expected time the process spends in state i = 1 after entering that state is 1.
• The unique stationary distribution is π = [2/3, 1/3].
Determine the distribution of X(1) if the process starts in the first state, i.e. X(0) = 1.
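In case it helps show where I'm stuck, here is a numerical sketch of what I think the setup implies. I'm assuming the exit rate from state 1 is q1 = 1 (since the mean holding time there is 1), and that the stationarity condition πQ = 0 with π = [2/3, 1/3] then forces q2 = 2, so the distribution of X(1) should be the first row of exp(Q·1) — but I'm not sure this reading is correct:

```python
import numpy as np

# Assumed generator: exit rate q1 = 1 from the mean holding time in state 1;
# stationarity pi @ Q = 0 with pi = [2/3, 1/3] then gives q2 = 2.
Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])

# P(t) = exp(tQ) via eigendecomposition (Q is diagonalizable here).
w, V = np.linalg.eig(Q)
P1 = (V * np.exp(w * 1.0)) @ np.linalg.inv(V)  # transition matrix P(1)

# Starting distribution concentrated on state 1.
dist = np.array([1.0, 0.0]) @ P1

print(dist)  # [P(X(1) = 1), P(X(1) = 2)]
```

Under these assumptions the eigenvalues of Q are 0 and -3, so I would expect P(X(1) = 1) = 2/3 + (1/3)e^{-3}, but I'd like to confirm the reasoning.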
Any help would be appreciated.