Prove this martingale convergence result


The problem is as follows:

Let $Y_1,Y_2,\ldots$ be nonnegative i.i.d. random variables with $E(Y_m)=1$, and let $X_n=\prod_{m\leq n} Y_m$. Show that $\lim_{n\rightarrow \infty}X_n=0$ almost surely if $P(Y_m=1)<1$.

What I have reached so far is that, by Jensen's inequality, $E(\log Y_m)<\log(E(Y_m))=0$; the inequality is strict because $\log$ is strictly concave and $P(Y_m=1)<1$ together with $E(Y_m)=1$ means $Y_m$ is not a.s. constant. Denote this value by $c<0$ (possibly $c=-\infty$).

But to prove the conclusion I would have to show that $\lim_{n\rightarrow \infty}\sum_{m\leq n}\log Y_m=-\infty$ almost surely, and then $X_n=e^{\sum_{m\leq n}\log Y_m}\rightarrow 0$. However, the conclusion I have is just about the expectation; the SLLN should be used here, but I don't see how. Any help?
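(As a sanity check rather than a proof, here is a quick simulation of exactly this mechanism. The lognormal law below is just one convenient choice of a nonnegative $Y$ with $E(Y)=1$ and $P(Y=1)<1$.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice (not from the problem): Y_m lognormal with E[Y_m] = 1.
# If log Y_m ~ N(-sigma^2/2, sigma^2), then E[Y_m] = 1 while E[log Y_m] = -sigma^2/2 < 0.
sigma = 1.0
n = 10_000
log_Y = rng.normal(-sigma**2 / 2, sigma, size=n)

S = np.cumsum(log_Y)   # S_n = sum_{m <= n} log Y_m
X = np.exp(S)          # X_n = prod_{m <= n} Y_m

for k in (100, 1_000, 10_000):
    print(f"n={k:>6}:  S_n/n = {S[k - 1] / k:+.4f}   X_n = {X[k - 1]:.3e}")
# S_n/n settles near E[log Y] = -0.5, so S_n ~ -0.5 n -> -infinity and X_n -> 0.
```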

Answer:

For simplicity, let's say that we have two constants $a\geq 1$ and $0\leq b\leq 1$, and that $P(Y=a)=p$, $P(Y=b)=1-p=q$, subject to the constraint $ap+bq=1$. Since $ap+bq=(a-b)p+b$, this forces $p=\frac{1-b}{a-b}$.
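For concreteness (the numbers are my own, chosen only for illustration): take $a=2$ and $b=\frac{1}{2}$; then $$p=\frac{1-b}{a-b}=\frac{1/2}{3/2}=\frac{1}{3},\qquad q=\frac{2}{3},$$ and indeed $E[Y]=ap+bq=\frac{2}{3}+\frac{1}{3}=1$.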

Now, $\lim_{n\rightarrow \infty}X_n=0\; a.s. \iff P(\lim_{n\rightarrow \infty}X_n>0)=0$; note that the limit exists a.s. in the first place because $X_n$ is a nonnegative martingale, so the martingale convergence theorem applies. As you pointed out, this is equivalent to requiring $\lim_{n\rightarrow \infty}\sum_{m\leq n}\log Y_m=-\infty\; a.s.$

With our simplified model, we have $P(\log Y_m = \log a>0)=p$ and $P(\log Y_m=\log b<0)=q$.

We can easily see that if $b=0$ then $\lim_{n\rightarrow \infty}X_n=0\; a.s.$: the constraint $ap+bq=1$ reduces to $ap=1$, i.e. $p=\frac{1}{a}$, so $P(X_m>0)=P(\{Y_i>0\;\;\forall i\leq m\})=p^m=\frac{1}{a^m}\rightarrow 0$. (Note that the event $\{\lim_{n\rightarrow \infty}X_n>0\}$ is not a tail event relative to the sequence of $Y_m$: a single $Y_i=0$ forces $X_n=0$ for all $n\geq i$.)

Now, what about $0<b<1$? Let $L_n:=\sum_{m\leq n}\log Y_m=\sum_{m\leq n} \left[\log a\, B_{p,m}+\log b\, (1-B_{p,m}) \right]$, where the $B_{p,m}$ are i.i.d. $Ber(p)$.

$E[L_n]=n\left[p\log a + q\log b\right]$ and $Var[L_n]=n(\log a-\log b)^2pq$
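For completeness, here is the computation behind the variance formula: writing $u=\log a$ and $v=\log b$, $$Var(\log Y_m)=pu^2+qv^2-(pu+qv)^2=pq(u-v)^2=pq(\log a-\log b)^2,$$ and the expression for $Var[L_n]$ follows by independence of the summands.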

The $\log Y_i$ are i.i.d. with finite mean and variance, so by the Kolmogorov SLLN we can conclude that $\frac{L_n}{n} \to p\log a + q\log b\;\; a.s.$
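A quick simulation (again with the illustrative values $a=2$, $b=\frac{1}{2}$ from above) confirms this numerically:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-point model with the illustrative values used earlier.
a, b = 2.0, 0.5
p = (1 - b) / (a - b)   # = 1/3, so that a*p + b*(1 - p) = 1
q = 1 - p

n = 100_000
B = rng.random(n) < p                         # Bernoulli(p) indicators B_{p,m}
log_Y = np.where(B, np.log(a), np.log(b))     # log Y_m equals log a or log b
L = np.cumsum(log_Y)                          # L_n = sum_{m <= n} log Y_m

print("L_n / n          :", L[-1] / n)
print("p log a + q log b:", p * np.log(a) + q * np.log(b))
# Both values are close to -0.2310, so L_n ~ -0.23 n -> -infinity a.s.
```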

What can we say about $p\log a + q\log b$? From the constraint, $$a=\frac{1-b}{p} + b=\frac{1-b(1-p)}{p}=\frac{1-bq}{p},$$ so $$p\log a+q\log b = p\log\left(\frac{1-bq}{p}\right)+q\log b = \log\left(\frac{(1-bq)^pb^q}{p^p} \right)<\log\left(\frac{(1-q)^p}{p^p} \right)=\log\left(\frac{p^p}{p^p} \right)=0.$$
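The strict inequality in the middle is used but not proved above; here is one way to justify it. For fixed $p\in(0,1)$, the function $g(b)=p\log(1-bq)+q\log b$ has derivative $$g'(b)=\frac{-pq}{1-bq}+\frac{q}{b}=\frac{q(1-b)}{b(1-bq)}>0 \quad\text{for } 0<b<1,$$ and $g(1)=p\log(1-q)=p\log p$, so $g(b)<p\log p$ on $(0,1)$, which is exactly the claimed inequality. (Alternatively, strict Jensen gives $p\log a+q\log b<\log(pa+qb)=\log 1=0$ directly whenever $a\neq b$.)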

Therefore, the mean of $\log Y_m$ is always negative, except in the case $a=b=1$ (i.e., $P(Y_m=1)=1$). By the SLLN, $\frac{L_n}{n}\rightarrow p\log a+q\log b<0\; a.s.$, hence $L_n\rightarrow -\infty\; a.s.$ and $X_n=e^{L_n}\rightarrow 0\; a.s.$ This is the connection between the mean of the summands and the behavior of the sample paths: the SLLN lets us pass from a statement about the expectation to a statement about the sums themselves.

At a more general level, the event $\{\lim_{n\rightarrow \infty}\sum_{m\leq n}\log Y_m=-\infty\}$ is an exchangeable (permutation-symmetric) event relative to a sequence of i.i.d. random variables. Hence, you can invoke the Hewitt-Savage 0-1 law to conclude that this event has probability either $0$ or $1$. Less formally: when $b=0$ the event occurs almost surely, whereas when $b=1$ (implying $a=1$) it occurs almost never, so if there were a threshold value of $p$ at which this probability switched, continuity suggests it should sit at $p=0$ or $p=1$; since $X_n$ is built by multiplication and bounded below by $0$, it is hard to see why some intermediate value such as $p=10^{-10^{10}}$ would be special.