A person climbs an infinite ladder. At each jump, the person moves up one step with probability $1-p$, or slips and falls all the way to the bottom with probability $p$.
a) Represent the person's height on the ladder as a Markov chain and show that the stationary distribution is geometric and find its parameter.
b) What is the long-run proportion of time for which the person is on the first step above the bottom of the ladder?
c) If the person has just fallen to the bottom, on average how many jumps will it take before they reach step $k$?
So far I have the following:
Let $X_n$ be the step number at time $n$. Since $P(X_{n+1} = i+1 \mid X_n = i) = 1-p$ and $P(X_{n+1} = 0 \mid X_n = i) = p$, we get the system of equations $$\pi_0 = p[\pi_0 + \pi_1 + \pi_2 + \cdots] = p \cdot 1, \qquad \pi_1 = (1-p)\pi_0, \qquad \pi_2 = (1-p)\pi_1, \ \ldots$$ and thus we find the pattern $\pi_n = p(1-p)^n$. However, I'm not sure whether this is correct, or how to proceed with the subsequent parts, especially given the infinite state space.
Part a.
From elementary theory, we know that for the stationary distribution $\pi$, $$\pi P = \pi.$$ From this, we can derive the relation $$\pi_k = (1-p)\pi_{k-1}, \qquad k \geq 1.$$
Additionally, we have that $$p\sum_{i=0}^\infty\pi_i=\pi_0 \Longrightarrow \pi_0 = \frac{p}{1-p}\sum_{i=1}^\infty\pi_i.$$
By the above result and normalizing, we get that $$\sum_{i=0}^\infty\pi_i = \frac{p}{1-p}\sum_{i=1}^\infty\pi_i + \sum_{i=1}^\infty\pi_i = 1 \Longrightarrow \sum_{i=1}^\infty\pi_i = 1-p.$$ Hence, $$\pi_0 = p.$$
Therefore, we have that $$\pi_1 = (1-p)\pi_0 = (1-p)p,$$ and generally $$\pi_k = (1-p)^kp.$$
We note this is a geometric distribution with parameter $p$.
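As an informal sanity check (not part of the formal argument), a short Monte Carlo simulation of the chain matches $\pi_k = p(1-p)^k$ closely; the function name and parameter choices below are my own.

```python
import random

def simulate_occupancy(p, n_jumps=200_000, seed=0):
    """Simulate the ladder chain (up w.p. 1-p, fall to 0 w.p. p)
    and return the empirical fraction of time spent on each step."""
    rng = random.Random(seed)
    counts = {}
    state = 0
    for _ in range(n_jumps):
        counts[state] = counts.get(state, 0) + 1
        state = state + 1 if rng.random() < 1 - p else 0
    return {k: c / n_jumps for k, c in counts.items()}

p = 0.3
emp = simulate_occupancy(p)
for k in range(4):
    # empirical occupancy vs. theoretical p*(1-p)^k
    print(k, round(emp.get(k, 0.0), 3), round(p * (1 - p) ** k, 3))
```

For $p = 0.3$ the empirical fractions land within a couple of percent of $0.3,\ 0.21,\ 0.147,\ \ldots$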
Part b.
This is of course just $\pi_1 = p(1-p)$, the long-run proportion of time spent on step $1$.
Part c.
This is simply the expected number of jumps until $k$ consecutive successful climbs, each succeeding with probability $1-p$. This is a fairly standard problem; see, e.g., Expected Number of Coin Tosses to Get Five Consecutive Heads.
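For completeness, first-step analysis gives a closed form: letting $E_j$ be the expected number of jumps to reach step $k$ from step $j$, we have $E_j = 1 + (1-p)E_{j+1} + pE_0$ with $E_k = 0$, which solves to $E_0 = \frac{(1-p)^{-k} - 1}{p}$ (the coin-toss result with success probability $1-p$). A rough Monte Carlo sketch checking this, with helper names of my own choosing:

```python
import random

def jumps_to_reach(k, p, rng):
    """Count jumps until the climber first stands on step k, starting from 0."""
    state, jumps = 0, 0
    while state < k:
        jumps += 1
        state = state + 1 if rng.random() < 1 - p else 0
    return jumps

def expected_jumps_closed_form(k, p):
    """((1-p)^-k - 1) / p: expected trials for k consecutive successes,
    each with success probability 1 - p."""
    return ((1 - p) ** -k - 1) / p

rng = random.Random(1)
p, k, trials = 0.3, 4, 50_000
mc = sum(jumps_to_reach(k, p, rng) for _ in range(trials)) / trials
print(round(mc, 2), round(expected_jumps_closed_form(k, p), 2))
```

For $p = 0.3$, $k = 4$, the closed form gives about $10.55$ jumps, and the simulated average agrees.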
==============================================
EDIT: The solution below is wrong; I misread the problem as falling one rung rather than all the way to the bottom. The revised version is above.
Part a.
From elementary theory, we know that for the stationary distribution $\pi$, $$\pi P = \pi.$$ From this (balancing the probability flow between steps $k$ and $k+1$), we can derive the relation $$(1-p)\pi_k = p\pi_{k+1}, \quad k \geq 0.$$
Thus, we have $$\pi_1 = \frac{1-p}{p}\pi_0.$$ We can then show (by induction if you want to be formal) that $$\pi_k = \left(\frac{1-p}{p}\right)^k \pi_0.$$
And since $\pi$ is a probability distribution we know that $$\sum_{k=0}^\infty\pi_k = 1.$$ By substituting in the above answer and summing the geometric series, we then have that $$\sum_{k=0}^\infty \pi_k = \sum_{k=0}^\infty\left(\frac{1-p}{p}\right)^k \pi_0 = \frac{\pi_0}{1-\frac{1-p}{p}} = 1 \Longrightarrow \pi_0 = 1 - \frac{1-p}{p} = \frac{2p-1}{p}.$$
We note the above sum converges only when $p > 0.5$; otherwise the chain is not positive recurrent (this should be very intuitive: with $p < 0.5$ the climber drifts off to infinity). Thus, we can substitute this back above to find $\pi_k$, i.e., $$\pi_k = \left(\frac{1-p}{p}\right)^k \pi_0 = \left(\frac{1-p}{p}\right)^k \left(1-\frac{1-p}{p}\right).$$ We note this is a geometric distribution with parameter $1 - \frac{1-p}{p} = \frac{2p-1}{p}.$
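Again as an informal check of this one-rung variant, a simulation with $p = 0.7$ reproduces the geometric distribution with ratio $\frac{1-p}{p}$. The sketch below assumes the climber simply stays at the bottom when slipping from step $0$; names are my own.

```python
import random

def simulate_one_rung(p, n_jumps=300_000, seed=2):
    """One-rung variant: up w.p. 1-p, down one rung w.p. p (stay at 0)."""
    rng = random.Random(seed)
    counts = {}
    state = 0
    for _ in range(n_jumps):
        counts[state] = counts.get(state, 0) + 1
        if rng.random() < 1 - p:
            state += 1
        else:
            state = max(state - 1, 0)
    return {k: c / n_jumps for k, c in counts.items()}

p = 0.7                 # needs p > 1/2 for positive recurrence
r = (1 - p) / p         # geometric ratio
emp = simulate_one_rung(p)
for k in range(3):
    # empirical occupancy vs. theoretical (1 - r) * r^k
    print(k, round(emp.get(k, 0.0), 3), round((1 - r) * r ** k, 3))
```

For $p = 0.7$ the predicted values are $\pi_0 = 4/7 \approx 0.571$, $\pi_1 = 12/49 \approx 0.245$, and the empirical fractions match.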