Suppose that $\{X_n\}_{n=1}^\infty$ is a sequence of random variables such that: $$X_1=\lambda\ , \ X_{n+1}\sim\text{Poi}(X_n)$$
(First we draw $X_n$; if $X_n=k$, then $X_{n+1}\sim \operatorname{Poi}(k)$.)
I am trying to find out whether or not the sequence converges:
- almost surely
- in $\mathcal{L}^1$
- in $\mathcal{L}^2$
Well, by definition, the only thing I can tell about this sequence is that:
$$P(X_{n+1}=k)=\sum_{m=0}^\infty P(X_{n+1}=k|X_n=m)P(X_n=m)=\sum_{m=0}^\infty \frac{e^{-m}m^k}{k!}P(X_n=m)$$
and I don't know how to proceed from here... Am I on the right track? How can I proceed?
Since the mean of $\operatorname{Poi}(\mu)$ is $\mu$, we have
$$ \mathbf{E}[X_{n} \mid X_1,\ldots,X_{n-1}] = X_{n-1} $$
and hence $(X_n)$ is a non-negative martingale. So by the martingale convergence theorem, $(X_n)$ converges a.s. Then by the generalized Borel–Cantelli lemma,
\begin{align*} \mathbf{P}(X_n = 0 \text{ i.o.}) &= \mathbf{P}\Biggl( \sum_{n=1}^{\infty} \mathbf{P}(X_n = 0 \mid X_1,\ldots,X_{n-1}) = \infty \Biggr) \\ &= \mathbf{P}\Biggl( \sum_{n=1}^{\infty} e^{-X_{n-1}} = \infty \Biggr) \\ &= 1, \end{align*}
where the last line follows from the fact that $X_n$ converges a.s. to a finite limit, so $e^{-X_{n-1}} \to e^{-X_{\infty}} > 0$ and the series diverges. Combining this with the observation that $X_{n+k} = 0$ for all $k \geq 0$ whenever $X_n = 0$ (because $\operatorname{Poi}(0) = \delta_0$, so $0$ is absorbing), we find that $(X_n)$ is eventually constant with value $0$ almost surely. Therefore
$$ \lim_{n\to\infty} X_n = 0 \quad \text{a.s.} $$
Since $\mathbf{E}[X_n] = \lambda$ for every $n$ by the martingale property, while $X_n \to 0$ a.s., convergence in $L^1$ would force $\lambda = 0$. Hence, for $\lambda > 0$, the sequence $(X_n)_{n\geq 1}$ converges in neither $L^1$ nor $L^2$ (recall that $L^2$ convergence implies $L^1$ convergence).
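This behavior is easy to see in a quick simulation (a sketch using NumPy; the starting value `lam = 4` and the step count are arbitrary choices): almost every chain gets absorbed at $0$, yet the sample mean stays near $\lambda$ because the few surviving chains are large — exactly the tension that rules out $L^1$ convergence.

```python
import numpy as np

rng = np.random.default_rng(0)

lam = 4            # hypothetical starting value X_1 = lambda
n_chains = 10_000  # independent copies of the chain
n_steps = 200

# Run X_{n+1} ~ Poi(X_n) for all copies in parallel;
# 0 is absorbing because Poi(0) = delta_0.
X = np.full(n_chains, lam)
for _ in range(n_steps):
    X = rng.poisson(X)

print("fraction of chains absorbed at 0:", np.mean(X == 0))
print("sample mean of X_n              :", X.mean())  # stays near lam (martingale)
```

The chain is in fact a critical Galton–Watson process with $\operatorname{Poi}(1)$ offspring, which is why extinction is certain yet the mean is conserved.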
Alternatively, once the a.s. convergence of $(X_n)$ has been established, we may directly prove that $X_{\infty} := \lim X_n = 0$ a.s. Indeed, by the law of iterated expectation, for each $s \geq 0$ we have
$$ \mathbf{E}[e^{-sX_n}] = \mathbf{E}[\mathbf{E}[e^{-sX_{n}}\mid X_{n-1}]] = \mathbf{E}[e^{X_{n-1}(e^{-s}-1)}] = \mathbf{E}[e^{-f(s)X_{n-1}}], $$
where $f(s) = 1 - e^{-s}$ and the moment generating function of the Poisson distribution ($\mathbf{E}[e^{tX}] = e^{\mu(e^t - 1)}$ for $X \sim \operatorname{Poi}(\mu)$, applied with $t = -s$) is used in the second step. Iterating this recurrence down to $X_1 = \lambda$ gives
$$ \mathbf{E}[e^{-sX_n}] = e^{-f^{\circ (n-1)}(s)\lambda}. $$
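As a sanity check on this formula, one can compare a Monte Carlo estimate of $\mathbf{E}[e^{-sX_n}]$ with the closed form (a sketch; the values `lam = 3`, `s = 0.5`, `n = 5` are arbitrary choices, and since $X_1 = \lambda$, reaching $X_n$ takes $n-1$ transitions, matching $n-1$ applications of $f$):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, s, n = 3, 0.5, 5     # arbitrary test values

# Monte Carlo estimate of E[exp(-s * X_n)] for the chain X_1 = lam, X_{k+1} ~ Poi(X_k)
X = np.full(200_000, lam)
for _ in range(n - 1):
    X = rng.poisson(X)
mc = np.exp(-s * X).mean()

# Closed form: apply f(s) = 1 - exp(-s) a total of n - 1 times, then exp(-lam * t)
t = s
for _ in range(n - 1):
    t = 1.0 - np.exp(-t)
closed_form = np.exp(-lam * t)

print(mc, closed_form)    # the two numbers should be close
```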
However, since $0 < f(s) < s$ whenever $s > 0$, the iterates $f^{\circ n}(s)$ decrease to some limit $\ell \geq 0$, which by continuity satisfies $f(\ell) = \ell$ and hence $\ell = 0$; that is, $f^{\circ n}(s) \to 0$ as $n \to \infty$ for every $s \geq 0$. Combining this with dominated convergence,
$$ \mathbf{E}[e^{-sX_{\infty}}] = \mathbf{E}[\lim_{n\to\infty} e^{-sX_n}] = \lim_{n\to\infty} \mathbf{E}[e^{-sX_n}] = 1.$$
Since $e^{-sX_{\infty}} \leq 1$, this implies that $e^{-s X_{\infty}} = 1$ a.s., and hence $X_{\infty} = 0$ a.s., as desired.
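The decay of the iterates $f^{\circ n}(s)$ can also be observed numerically (a sketch; the starting point $s = 1$ is an arbitrary choice). In fact, for maps of the form $f(s) = s - \tfrac{1}{2}s^2 + O(s^3)$ near a neutral fixed point at $0$, the iterates decay like $2/n$, a standard fact:

```python
import math

def f(s):
    # the map from the recurrence: f(s) = 1 - exp(-s), with 0 < f(s) < s for s > 0
    return 1.0 - math.exp(-s)

s0 = 1.0
for n in (10, 100, 1000, 10000):
    t = s0
    for _ in range(n):
        t = f(t)
    print(n, t)   # the n-th iterate shrinks toward 0, roughly like 2/n
```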