Suppose I flip a biased coin infinitely many times, and at any given time I'm interested in the difference between the numbers of tails (T) and heads (H) that have come up so far (but not exactly that, as the definition below shows). Let $\xi_n$ be the result of the $n^{th}$ flip: $-1$ for heads, with probability $q$, and $+1$ for tails, with probability $p$.
I define $X_{n+1} = \begin{cases}X_n+\xi_n, & \text{if } X_n \geq 1, \\ \max\{0, \xi_n\}, & \text{otherwise.} \end{cases}$
Is this some kind of random walk? And what is the expected distance of the walk from $0$?
Is it $\mathbb{E}\left(\lim\limits_{n \to \infty}\frac{1}{n} \sum\limits_{i=1}^{n} X_i\right)$?
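For concreteness, the process can be simulated directly from the definition. This is only a sketch; the choices $p=0.3$, the number of steps, and the seed are arbitrary:

```python
import random

def simulate(p, n_steps, seed=0):
    """Simulate the reflected walk: X_{n+1} = X_n + xi_n if X_n >= 1,
    otherwise max(0, xi_n), where xi_n = +1 w.p. p and -1 w.p. 1 - p."""
    rng = random.Random(seed)
    x = 0
    path = []
    for _ in range(n_steps):
        xi = 1 if rng.random() < p else -1
        x = x + xi if x >= 1 else max(0, xi)
        path.append(x)
    return path

path = simulate(p=0.3, n_steps=100_000)
avg = sum(path) / len(path)   # the running average (1/n) * sum of X_i
```

For $p<\frac12$ the empirical average `avg` settles down to a finite value, which is what the question is asking about.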
HINT
Yes: this is a Markov chain on the infinite state space $\{0,1,2,\dots\}$, with $X_n$ the state at the $n^{th}$ step. The transition probabilities are $P(i,i+1)=p$ and $P(i,i-1)=q=1-p$ for $i\geq 1$, together with $P(0,1)=p$ and $P(0,0)=q$.
The stationary state probabilities are
$$\pi_i=(1-\alpha)\alpha^i, \qquad i=0,1,2,\dots, \quad \alpha=\frac{p}{1-p}.$$
This is a valid probability distribution only when $\alpha<1$, that is, when $p<\frac12$. If $p=\frac12$ then $\alpha=1$ and every $\pi_i=(1-\alpha)\alpha^i$ vanishes, so no stationary distribution exists.
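As a quick numerical sanity check (a sketch; $p=0.3$ and the truncation level $N$ are arbitrary choices), one can verify that the geometric vector is invariant under one step of the chain:

```python
p = 0.3
q = 1 - p
alpha = p / (1 - p)
N = 200  # truncation level; the tail mass alpha**N is negligible here

pi = [(1 - alpha) * alpha**i for i in range(N)]

# One step of the chain applied to pi: new_pi[j] = sum_i pi[i] * P(i, j),
# where P(0,0)=q, P(0,1)=p, and P(i,i-1)=q, P(i,i+1)=p for i >= 1.
new_pi = [0.0] * N
new_pi[0] += pi[0] * q          # from 0, stay at 0 on heads
new_pi[1] += pi[0] * p          # from 0, move to 1 on tails
for i in range(1, N):
    new_pi[i - 1] += pi[i] * q
    if i + 1 < N:
        new_pi[i + 1] += pi[i] * p

max_err = max(abs(a - b) for a, b in zip(pi, new_pi))
```

Up to the (tiny) truncation error, `new_pi` coincides with `pi`, confirming stationarity for $p<\frac12$.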
Assume now that $p<\frac12$. If $X$ denotes the state at a random moment (under the stationary distribution), then $E[X]$ can be calculated as the expectation of a geometrically distributed random variable. If $p\geq\frac12$ this expectation is infinite.
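Explicitly, for $p<\frac12$ (so $\alpha<1$) the stationary mean works out to
$$E[X]=\sum_{i=0}^{\infty} i\,(1-\alpha)\alpha^{i}=\frac{\alpha}{1-\alpha}=\frac{p}{1-2p}.$$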
I am hoping here that the average of the $X_n$'s converges to $E[X]$; if not, my hint flunks. (For $p<\frac12$ it does: the chain is positive recurrent, so by the ergodic theorem $\frac1n\sum_{i=1}^n X_i \to E[X]$ almost surely.)
My intuition is that $E[X]$ averages the state numbers weighted by the long-run fraction of time spent in each state, and the time average of the $X_n$'s ought to measure the same thing.
If $p\geq \frac12$, the intuition is that the expectation above is $\infty$.
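The two regimes can be seen empirically. A rough sketch (the values $p=0.3$, $p=0.6$, the step counts, and the seed are arbitrary choices):

```python
import random

def avg_after(p, n_steps, seed=1):
    """Empirical time average (1/n) * sum_{i<=n} X_i of the reflected walk."""
    rng = random.Random(seed)
    x, total = 0, 0
    for _ in range(n_steps):
        xi = 1 if rng.random() < p else -1
        x = x + xi if x >= 1 else max(0, xi)
        total += x
    return total / n_steps

# Positive recurrent case p < 1/2: the average settles near p / (1 - 2p).
print(avg_after(0.3, 10_000), avg_after(0.3, 100_000))
# Transient case p > 1/2: the walk drifts to infinity and the average keeps growing.
print(avg_after(0.6, 10_000), avg_after(0.6, 100_000))
```

For $p=0.3$ both averages hover around $p/(1-2p)=0.75$, while for $p=0.6$ the average grows roughly linearly in the number of steps.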