Generating function, stopping time and Markov chain


Suppose we have a simple random walk $ X_{n} $ and let $ f $ be the generating function of $ T_{0} = \min \{n \geqslant 0: X_{n} = 0 \} $ starting at $1$, that is $f(x)=\mathbb{E}_{1}(x^{T_{0}}):=\mathbb{E}(x^{T_{0}}|X_{0}=1)$.

I must prove that $\mathbb{E}_{2}(x^{T_{0}})=f(x)^{2}$. I have tried to decompose: $$\mathbb{E}_{2}(x^{T_{0}})=\sum_{k=0}^{\infty} P_{2}(T_{0}=k)x^{k}$$ I am able to prove that $P_{2}(T_{0}<\infty)=[P_{1}(T_{0}<\infty)]^{2}$, but I don't see how to apply it to my problem. I would appreciate any suggestions.



Accepted answer:

Starting in state $2$, the hitting time $T_0$ can be expressed as $T_1+T_0'$, in which $T_0'$ is $$ \min\{n\ge 1:X_{T_1+n}=0\}. $$ By the strong Markov property, $T_1$ and $T_0'$ are independent, and $T_0'$ has the same distribution (under $P_2$) as $T_1$ (because $(X_n)$ is a spatially homogeneous random walk), and each has the same distribution as $T_0$ under $P_1$. Consequently $$\mathbb{E}_2(x^{T_0})=\mathbb{E}(x^{T_1})\,\mathbb{E}(x^{T_0'})=f(x)^2.$$
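As a quick numerical sanity check of this identity (not part of the proof), here is a Monte Carlo sketch in Python, assuming the symmetric walk ($p=1/2$) and evaluating both generating functions at $x=0.9$. The function name `gf_estimate` and its parameters are illustrative, and the walk is truncated at `cap` steps, which is harmless since $x^n$ is negligible beyond that point.

```python
import math
import random

def gf_estimate(start, x, trials=50_000, cap=500, p=0.5, seed=0):
    """Monte Carlo estimate of E_start[x^{T_0}] for a simple random walk
    with up-probability p. Walks still alive after `cap` steps contribute
    (at most) x**cap, which is negligible for x < 1."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        pos, n = start, 0
        while pos != 0 and n < cap:
            pos += 1 if rng.random() < p else -1
            n += 1
        if pos == 0:
            total += x ** n
    return total / trials

# Compare E_2[x^{T_0}] with (E_1[x^{T_0}])^2 at x = 0.9
f1 = gf_estimate(1, 0.9, seed=1)
f2 = gf_estimate(2, 0.9, seed=2)
print(f1 ** 2, f2)  # the two numbers should agree up to Monte Carlo error

# For the symmetric walk there is also a closed form to compare against:
# E_1[x^{T_0}] = (1 - sqrt(1 - x^2)) / x
print((1 - math.sqrt(1 - 0.81)) / 0.9)
```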

Second answer:

Hint: By the Cauchy product, we can write $$ \left(\sum_{k=0}^\infty P_1(T_0 = k)x^k\right)^2 = \sum_{k=0}^\infty \left(\sum_{p=0}^k P_1(T_0 = p)\cdot P_1(T_0 = k-p)\right) x^k. $$ So, matching coefficients, it suffices to show that $P_2(T_0 = k) = \sum_{p=0}^k P_1(T_0 = p)\,P_1(T_0 = k-p)$, which follows by conditioning on the time of the first visit to $1$ and using the strong Markov property.
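To see the convolution identity behind this hint numerically, here is a short exact computation, assuming the symmetric walk ($p=1/2$). The helper `hit_dist` is a hypothetical name; it computes $P_{\text{start}}(T_0=k)$ exactly by dynamic programming over the walk's position, with $0$ absorbing.

```python
def hit_dist(start, nmax, p=0.5):
    """Exact distribution P_start(T_0 = k), k = 0..nmax, by dynamic
    programming: probs[pos] is the probability of being at pos at the
    current time without having hit 0 yet."""
    if start == 0:
        return [1.0] + [0.0] * nmax
    probs = {start: 1.0}
    dist = [0.0] * (nmax + 1)
    for k in range(1, nmax + 1):
        new = {}
        for pos, pr in probs.items():
            for step, q in ((1, p), (-1, 1 - p)):
                nxt = pos + step
                if nxt == 0:
                    dist[k] += pr * q  # absorbed at time k
                else:
                    new[nxt] = new.get(nxt, 0.0) + pr * q
        probs = new
    return dist

d1 = hit_dist(1, 20)
d2 = hit_dist(2, 20)
# Cauchy-product check: P_2(T_0 = k) equals the convolution of P_1(T_0 = .)
# with itself, term by term.
for k in range(21):
    conv = sum(d1[j] * d1[k - j] for j in range(k + 1))
    assert abs(conv - d2[k]) < 1e-12
```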