Let $Y_{1}, Y_{2}, \ldots$ be i.i.d. random variables that only take positive integer values. For $i=1,2, \ldots,$ let $p_{i}:=\mathbb{P}\left(Y_{1}=i\right) .$ Suppose that $\mu:=\mathbb{E}\left[Y_{1}\right]<\infty .$ Define a sequence of random variables $X_{n}$ as follows: $$ X_{n}=\inf \left\{m \geq n: m=Y_{1}+\ldots+Y_{k} \text { for some } k \geq 0\right\}-n $$ For every $n=1,2, \ldots,$ further define a function $$ f(n):=\mathbb{P}\left(n=Y_{1}+\ldots+Y_{k} \text { for some } k \geq 0\right) $$
Prove that $\left\{X_{n}\right\}_{n=0}^{\infty}$ forms a Markov chain and find the transition probabilities.
Find the necessary and sufficient condition for the limit $\lim _{n \rightarrow \infty} f(n)$ to exist. Please state the necessary and sufficient condition in terms of $p_{1}, p_{2}, \ldots$ and prove your statement.
When the limit $\lim _{n \rightarrow \infty} f(n)$ exists, find the limit. Express it in terms of $\mu$ and/or $p_{1}, p_{2}, \ldots$
I have no idea how to start. Since the pmf of $Y$ is not given explicitly, it is hard to get the distribution of $X_n$. Does anyone have any ideas? Thanks.
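To build some intuition, one can compute $f(n)$ numerically for a concrete choice of pmf. Since $k \geq 0$ is allowed, $f(0)=1$ (the empty sum), and conditioning on $Y_1$ gives the recursion $f(n)=\sum_{i=1}^{n} p_i\, f(n-i)$. The sketch below uses a hypothetical pmf of my own choosing (not from the problem) and compares $f(n)$ for large $n$ against $1/\mu$:

```python
# Numerical exploration of f(n), assuming an example pmf for Y.
# f(n) = P(n = Y_1 + ... + Y_k for some k >= 0) satisfies
#   f(0) = 1,   f(n) = sum_{i=1}^{n} p_i * f(n - i)   for n >= 1,
# by conditioning on the value of Y_1.
p = {1: 0.2, 2: 0.5, 3: 0.3}             # hypothetical pmf; support {1, 2, 3}
mu = sum(i * pi for i, pi in p.items())  # E[Y_1] = 2.1 for this example

N = 10_000
f = [0.0] * (N + 1)
f[0] = 1.0                               # empty sum (k = 0) hits n = 0
for n in range(1, N + 1):
    f[n] = sum(pi * f[n - i] for i, pi in p.items() if i <= n)

print(f[N], 1 / mu)  # for this pmf, f(N) appears to settle near 1/mu
```

This only illustrates behavior for one pmf; trying a pmf supported on even integers only (e.g. $p_2 = 1$) shows $f(n)$ oscillating between $0$ and $1$, which hints at what the necessary and sufficient condition might involve.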