Expected time to reach a state in a Markov chain


Consider a Markov chain on the integers from 1 to $n$, starting at 1.
At every jump, the process moves to a state strictly larger than its current one, chosen uniformly at random: if there are $k$ larger states to choose from, each is picked with probability $\frac{1}{k}$.
What is the expected time to reach state $n$? And what happens as $n$ goes to infinity?

I have been able to show that the expected time is finite for every fixed $n$, but I am having a hard time finding a general rule to compute it as a function of $n$.
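For concreteness, here is a small Python sketch (function names are my own) of the first-step recursion this chain satisfies. Writing $E_k$ for the expected number of jumps from state $k$, conditioning on the first jump gives $E_n = 0$ and $E_k = 1 + \frac{1}{n-k}\sum_{j=k+1}^{n} E_j$, which can be evaluated backward from $k = n$; a Monte Carlo run can be used as a sanity check:

```python
import random

def expected_time(n):
    """Expected number of jumps to reach state n from state 1,
    computed by backward recursion on E[k] = 1 + (1/(n-k)) * sum_{j>k} E[j]."""
    E = [0.0] * (n + 1)  # E[n] = 0; indices 1..n used
    for k in range(n - 1, 0, -1):
        E[k] = 1.0 + sum(E[k + 1:n + 1]) / (n - k)
    return E[1]

def simulate(n, trials=100_000, seed=0):
    """Monte Carlo estimate of the same quantity, as a cross-check."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        state, steps = 1, 0
        while state < n:
            # jump to a uniformly chosen strictly larger state
            state = rng.randint(state + 1, n)
            steps += 1
        total += steps
    return total / trials
```

For example, `expected_time(3)` gives $1.5$, matching the hand computation $E_2 = 1$, $E_1 = 1 + \frac{1}{2}(E_2 + E_3)$; comparing `expected_time(n)` with `simulate(n)` for small $n$ confirms the recursion.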