Exponential of the sum of a stochastic process


Consider the following exercise, from Lawler's Stochastic Calculus:

Let $X_1 , X_2 , \dots$ be independent, identically distributed random variables with $$ P\{X_j = 1\} = q, \qquad P\{X_j = -1\} = 1 - q. $$ Let $S_0 = 0$ and for $n \ge 1$, $S_n = X_1 + X_2 + \cdots + X_n$. Let $Y_n = e^{S_n}$.

  1. For which value of $q$ is $Y_n$ a martingale?
  2. For the remaining parts of this exercise assume $q$ takes the value from part 1. Explain why $Y_n$ satisfies the conditions of the martingale convergence theorem.
  3. Let $Y_\infty = \lim_{n\to\infty} Y_n$. Explain why $Y_\infty = 0$. (Hint: there are at least two ways to show this. One is to consider $\log Y_n$ and use the law of large numbers. Another is to note that with probability one $Y_{n+1} /Y_n$ does not converge.)
  4. Use the optional sampling theorem to determine the probability that $Y_n$ ever attains a value greater than 100.
  5. Does there exist a $C < \infty$ such that $E[Y_n^2 ] \le C$ for all $n$?

For the first question, we have to check that $E(Y_{n+1}|\mathcal{F}_n)=Y_n$, where $\mathcal{F}_n$ is the $\sigma$-algebra generated by $X_1,\dots,X_n$. Since $Y_n$ is $\mathcal{F}_n$-measurable and $X_{n+1}$ is independent of $\mathcal{F}_n$, $$ \begin{align} E(Y_{n+1}|\mathcal{F}_n) &= E(Y_n e^{X_{n+1}}|\mathcal{F}_n) = Y_n E(e^{X_{n+1}}) \\ &= Y_n\left(qe + (1-q)e^{-1}\right). \end{align} $$ Hence $Y_n$ is a martingale iff $$ qe + (1-q)e^{-1}=1, $$ which gives $$ q = \frac{1-e^{-1}}{e - e^{-1}} = \frac{1}{e+1} \approx 0.269. $$
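As a quick numerical sanity check of this value of $q$ (a Python sketch, not part of the proof), we can verify that it makes the one-step multiplier $E(e^{X})$ equal to 1:

```python
import math

# Value of q found in part 1: q = (1 - e^{-1}) / (e - e^{-1})
q = (1 - math.exp(-1)) / (math.e - math.exp(-1))

# One-step factor E[e^X] = q*e + (1-q)*e^{-1}; equals 1 iff Y_n is a martingale.
one_step_mean = q * math.e + (1 - q) * math.exp(-1)
print(q)               # ≈ 0.2689, i.e. 1/(e+1)
print(one_step_mean)   # ≈ 1.0
```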

For part 2, we have to check that $\sup_n E(|Y_n|)<\infty$. This can be computed using the law of the unconscious statistician: writing $k$ for the number of $+1$ steps among the first $n$, so that $S_n = 2k-n$, $$ \begin{align} E(|e^{S_n}|)&=E(e^{S_n})=\sum_{k=0}^n e^{2k-n} {n\choose k}q^k(1-q)^{n-k}=e^{-n}\left(qe^2 + (1-q)\right)^n \\ &= e^{-n}e^n = 1, \end{align} $$ where I used the fact that for the value of $q$ from part 1, $$ qe^2 + (1-q)=q(e^2-1)+1=(e-1)+1=e. $$ The bound is independent of $n$, which proves this point. (Alternatively: $Y_n$ is a non-negative martingale, so $E(|Y_n|)=E(Y_n)=E(Y_0)=1$ for all $n$.)
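The binomial sum above can be checked exactly for a few values of $n$ (again just a sanity-check sketch):

```python
import math
from math import comb

q = 1 / (math.e + 1)  # martingale value of q from part 1

def expected_Y(n):
    """E[e^{S_n}] via the binomial law of the number k of +1 steps (S_n = 2k - n)."""
    return sum(math.exp(2 * k - n) * comb(n, k) * q**k * (1 - q)**(n - k)
               for k in range(n + 1))

for n in (1, 5, 20):
    print(n, expected_Y(n))  # each ≈ 1.0
```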

For part 3, consider $\log Y_n = S_n$, as the first hint suggests. By the strong law of large numbers, $$ \frac{S_n}{n} \to E(X_1) = 2q-1 = \frac{2}{e+1}-1 = \frac{1-e}{1+e} < 0 \qquad \text{almost surely}, $$ so $S_n \to -\infty$ almost surely, and therefore $Y_n = e^{S_n}\to 0$. (Alternatively, following the second hint: $Y_{n+1}/Y_n = e^{X_{n+1}} \in \{e, e^{-1}\}$, so this ratio never converges to $1$; if $Y_n$ converged to a positive limit the ratio would have to tend to $1$, hence the only possible limit is $Y_\infty = 0$.)
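A single simulated path (a hedged sketch, with the horizon $n=10{,}000$ chosen arbitrarily) illustrates the negative drift and the collapse of $Y_n$:

```python
import math
import random

q = 1 / (math.e + 1)
random.seed(0)

n = 10_000
# One sample path: each step is +1 with probability q, else -1.
s = sum(1 if random.random() < q else -1 for _ in range(n))

print(s / n)        # close to 2q - 1 ≈ -0.462
print(math.exp(s))  # Y_n = e^{S_n}, so small it underflows to 0.0
```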

For point 4, I think it is simpler to rephrase the question in the equivalent form: what is the probability that $S_n$ ever attains the value $\lceil\log 100\rceil=5$? (Indeed $e^5 \approx 148 > 100$, while $e^4 \approx 55 < 100$.)

Let $T=\min\{n : S_n = 5\}$ and apply the optional sampling theorem to the martingale $Y_n$ (note that $S_n$ itself is not a martingale, since $E(X_1)=2q-1\ne 0$). The stopped process $Y_{n\wedge T}$ is a martingale bounded by $e^5$, so for every $n$ $$ E(Y_{n\wedge T})=E(Y_0)=1. $$ Letting $n\to\infty$: on $\{T<\infty\}$ we have $Y_{n\wedge T}\to e^5$, while on $\{T=\infty\}$ we have $Y_{n\wedge T}=Y_n\to Y_\infty=0$ by part 3. By bounded convergence, $$ 1 = e^5\,P\{T<\infty\}, $$ so the probability that $Y_n$ ever exceeds 100 is $$ P\{T<\infty\} = e^{-5}\approx 0.0067. $$ In particular, because of the negative drift the walk is not certain ever to reach level 5.
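A Monte Carlo estimate agrees with $e^{-5}$ (a sketch only: the horizon of 500 steps and 20,000 paths are arbitrary choices, and truncating the horizon slightly undercounts very late hits):

```python
import numpy as np

q = 1 / (np.e + 1)
rng = np.random.default_rng(42)

paths, horizon = 20_000, 500
# ±1 steps with P(+1) = q, then the running sums S_1, ..., S_horizon per path.
steps = np.where(rng.random((paths, horizon)) < q, 1, -1)
s = np.cumsum(steps, axis=1)

# Fraction of paths whose running maximum ever reaches 5 (i.e. Y_n >= e^5 > 100).
hit = np.mean(s.max(axis=1) >= 5)
print(hit, np.exp(-5))  # both ≈ 0.0067
```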

As for point 5, repeating the computation of part 2 with $e^{2S_n}$ in place of $e^{S_n}$ gives $$ E(Y_n^2)=E(e^{2S_n})=e^{-2n}\left(qe^4+(1-q)\right)^n=\left(e-1+e^{-1}\right)^n, $$ since $qe^4+(1-q)=q(e^4-1)+1=(e-1)(e^2+1)+1=e^3-e^2+e$. As $e-1+e^{-1}\approx 2.09>1$, we get $E(Y_n^2)\to\infty$, so no such constant $C$ exists.
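The claim that the second moment grows geometrically with ratio $e-1+e^{-1}$ can be checked against the exact binomial sum (a sanity-check sketch, mirroring the part 2 computation):

```python
import math
from math import comb

q = 1 / (math.e + 1)

def second_moment(n):
    """E[e^{2 S_n}] via the binomial law of the number k of +1 steps (S_n = 2k - n)."""
    return sum(math.exp(2 * (2 * k - n)) * comb(n, k) * q**k * (1 - q)**(n - k)
               for k in range(n + 1))

ratio = math.e - 1 + math.exp(-1)  # ≈ 2.086, the per-step growth factor
for n in (1, 5, 10):
    print(n, second_moment(n), ratio**n)  # the two columns agree
```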

Any help on this problem would be much appreciated (especially if there are quicker or standard ways to do it).