Suppose $\{X_n(t)\}_{n = 1}^\infty$ is a sequence of i.i.d. nonnegative Lévy processes (nonnegative meaning here: $P(X(t) \geq 0) = 1$ for all $t \in \mathbb{R}_+$). Now define a sequence of random variables $\{A_n\}_{n = 0}^\infty$ by the following recurrence: $$A_n = \begin{cases} 1 & \quad n = 0 \\ X_n(A_{n-1}) & \quad n \geq 1 \end{cases}$$
Under what conditions does $\{A_n\}_{n = 1}^\infty$ converge to $0$ almost surely?
If $E(X_1(1)) < 1$, this seems to be true by Markov's inequality and the fact that $E(A_n) = (E(X_1(1)))^n$, which can be derived from Fubini's theorem. However, I do not know what to do in the case $E(X_1(1)) \geq 1$.
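(For completeness, one way to see the identity $E(A_n) = (E(X_1(1)))^n$ is by conditioning, using that $E(X(t)) = t\,E(X(1))$ for a Lévy process with finite mean, and that $X_n$ is independent of $A_{n-1}$:
$$E(A_n) = E\big(E(X_n(A_{n-1}) \mid A_{n-1})\big) = E\big(A_{n-1}\,E(X_1(1))\big) = E(X_1(1))\,E(A_{n-1}),$$
so induction with $A_0 = 1$ gives $E(A_n) = (E(X_1(1)))^n$.)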
Note that the probability generating function method from classical branching process theory cannot be applied here, because the $A_n$ are not necessarily discrete.
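As a numerical sanity check of the subcritical case, here is a minimal Monte Carlo sketch. It assumes $X$ is a gamma subordinator, i.e. $X(t) \sim \mathrm{Gamma}(\text{shape}=\alpha t,\ \text{scale}=\theta)$, so that $E(X(1)) = \alpha\theta$; the names `alpha`, `theta`, and `iterate` are just illustrative choices, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(42)

def iterate(alpha, theta, n_steps, n_paths):
    """Simulate A_n = X_n(A_{n-1}) with A_0 = 1 for n_paths independent runs,
    where X is a gamma subordinator: X(t) ~ Gamma(shape=alpha*t, scale=theta)."""
    a = np.ones(n_paths)
    for _ in range(n_steps):
        pos = a > 0          # a path absorbed at 0 stays at 0
        new = np.zeros(n_paths)
        # Gamma(alpha * a, theta) has mean alpha * theta * a = E[X(1)] * a
        new[pos] = rng.gamma(alpha * a[pos], theta)
        a = new
    return a

# Subcritical case E[X(1)] = alpha * theta = 0.5: paths collapse toward 0,
# consistent with E[A_n] = 0.5**n
final = iterate(alpha=0.5, theta=1.0, n_steps=50, n_paths=2000)
print(np.mean(final))
```

The gamma subordinator is only one convenient choice here, since its marginals are directly samplable; any nonnegative Lévy process whose marginal at a given time can be sampled would serve the same purpose.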