Estimate of probability when the sum of iid normal random variables goes below 0


This is a homework problem from my probability theory course.

Suppose that $X_i$ are iid normal random variables with mean $\mu > 0$ and variance $\sigma^2 > 0$. Given that $$ S_n = S_0 + \sum_{k=1}^{n} X_k\quad (S_0 \geqslant 0), $$ one is asked to prove $$ \mathbb{P}(S_n \leqslant 0) \leqslant \exp\left(-\frac{2\mu S_0}{\sigma^2}\right). $$

Below is my attempt:

Let $T = \inf\{n : S_n\leqslant 0\}$; the task then becomes to estimate $\mathbb{P}(T < \infty)$. What occurs to me is to construct a martingale out of $S_n$. I know that for nonnegative iid random variables $X_i$ with $\mathbb{E}X_i = 1$, $$ M_n = M_0 \cdot\prod_{i=1}^n X_i $$ is a martingale. So I let $$ Y_k = \exp\left(X_k - \mu - \frac{\sigma^2}{2}\right) $$ so that $Y_k$ is nonnegative and $\mathbb{E}Y_k = 1$. Then $$ M_n := e^{S_0}\cdot\prod_{k=1}^n Y_k = \frac{\exp(S_n)}{\exp\left(n\mu + \frac{n\sigma^2}{2}\right)} $$ is a martingale and $S_n \leqslant 0 \iff M_n \leqslant \exp\left(-n\mu - \frac{n\sigma^2}{2}\right)$.
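As a quick numerical sanity check of the construction (not part of the proof), one can verify by simulation that $\mathbb{E}\left[\exp\left(X - \mu - \frac{\sigma^2}{2}\right)\right] = 1$ for a Gaussian $X$; the parameter values below ($\mu = 0.5$, $\sigma = 1$) are arbitrary choices for illustration:

```python
import numpy as np

# Hypothetical parameters, chosen only for illustration
mu, sigma = 0.5, 1.0

rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, size=1_000_000)

# Y = exp(X - mu - sigma^2/2) should have mean 1,
# which makes M_n = e^{S_0} * prod Y_k a martingale.
y = np.exp(x - mu - sigma**2 / 2)
print(y.mean())  # should be close to 1
```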

And it is at this step that I got stuck. Could anyone please give a hint about how to proceed? Thanks in advance for any help! :)

Here is a (most likely suboptimal) solution using Chernoff's bounding technique:

If the $X_i$ are i.i.d. Gaussian with mean $\mu$ and variance $\sigma^2$, then $S_n$ is Gaussian with mean $n\mu + S_0$ and variance $n\sigma^2$ (the variances of the $n$ independent summands add). For every $s > 0$ we have $$\begin{align} \mathbb P(S_n\le0) &= \mathbb P(-sS_n\ge 0)\\ &= \mathbb P\left(e^{-sS_n}\ge 1\right)\\ &\le\mathbb E\left[e^{-sS_n}\right]\,\,\text{(Markov's inequality)}\end{align} $$ Now we use the expression of the Moment Generating Function of Gaussian random variables to get $$ \mathbb E\left[e^{-sS_n}\right] = \exp\left(-s(n\mu+S_0) + \frac{s^2 n\sigma^2}{2}\right) $$ Since the bound holds for every $s>0$, to prove the desired inequality it suffices to find some $s>0$ such that $$ -s(n\mu+S_0) + \frac{s^2 n\sigma^2}{2} \le \frac{-2\mu S_0}{\sigma^2}$$ or equivalently $$ \frac{n\sigma^2}{2}s^2 - (n\mu+S_0)s + \frac{2\mu S_0}{\sigma^2}\le0$$ The discriminant of the corresponding quadratic in $s$ is $$\begin{align}\Delta &= (n\mu+S_0)^2 - 4\cdot\frac{n\sigma^2}{2}\cdot\frac{2\mu S_0}{\sigma^2}\\ &= (n\mu+S_0)^2 - 4n\mu S_0\\ &=(n\mu-S_0)^2 \ge 0\end{align} $$ We can see from $\Delta$'s expression that real roots are guaranteed to exist for every $n\ge1$; in fact, direct substitution shows that $s = 2\mu/\sigma^2 > 0$ is always a root. It thus follows that for all $n\ge 1$: $$\mathbb P(S_n\le 0) \le \inf_{s>0}\left\{\mathbb E\left[e^{-sS_n}\right]\right\} \le \exp\left(\frac{-2\mu S_0}{\sigma^2}\right) $$ which proves the desired result.
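For what it's worth, the bound is easy to check by simulation. The sketch below uses arbitrary illustrative parameters ($\mu = 0.5$, $\sigma = 1$, $S_0 = 1$, $n = 5$) and compares the empirical frequency of $\{S_n \le 0\}$ with $\exp(-2\mu S_0/\sigma^2)$:

```python
import numpy as np

# Hypothetical parameters, chosen only for illustration
mu, sigma, s0, n = 0.5, 1.0, 1.0, 5
trials = 1_000_000

rng = np.random.default_rng(0)

# S_n = S_0 + sum of n iid N(mu, sigma^2) increments, simulated `trials` times
s_n = s0 + rng.normal(mu, sigma, size=(trials, n)).sum(axis=1)

empirical = (s_n <= 0).mean()          # Monte Carlo estimate of P(S_n <= 0)
bound = np.exp(-2 * mu * s0 / sigma**2)  # the claimed upper bound

print(f"P(S_n <= 0) ~ {empirical:.4f} <= bound {bound:.4f}")
```

Note that the bound is uniform in $n$, while the true probability $\Phi\!\left(-\frac{n\mu+S_0}{\sigma\sqrt{n}}\right)$ decays with $n$, so the inequality is typically far from tight for large $n$.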