For i.i.d. $\{X_n\}$ with non-constant $X_n > 0$ and $\mathbb{E}X_n = q \leq 1$, the partial products $\prod\limits_{k=1}^{n}X_k \to 0$ almost surely


Suppose $\{X_n\}$ is i.i.d. with $X_n > 0$, and $X_n$ is not almost surely constant.

I want to show that if $\mathbb{E}X_n = q \leq 1$, then the partial products $\prod\limits_{k=1}^{n}X_k \to 0$ almost surely as $n \to \infty$. The problem statement also hints that the cases $q < 1$ and $q = 1$ can be treated separately.

I thought about this one for a long time. Taking logarithms to turn the product into a sum seems promising, but I couldn't make progress from there; I still don't see how to reduce the problem to the law of large numbers.
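
For what it's worth, a quick simulation in Python is consistent with the claim (the uniform distributions below are just my own illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000  # number of factors per run

for q in (1.0, 0.9):
    # X_n uniform on (0, 2q): positive, non-constant, E[X_n] = q
    x = rng.uniform(0.0, 2.0 * q, size=N)
    # work in log space to avoid underflow of the raw product
    log_partial = np.cumsum(np.log(x))
    # both runs print a large negative number, i.e. the product is ~ 0
    print(f"q={q}: log of partial product after {N} factors:", log_partial[-1])
```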

I would appreciate any help!


Answer 1.

Since $X_{n}>0$, we have $q=E[X_{n}]>0$. Define the process $M=\{M_{n}\mid n\in\mathbb{N}\}$ by $M_{n}=\prod_{k=1}^{n}\frac{X_{k}}{q}.$ Let $\mathbb{F}=\{\mathcal{F}_{n}\mid n\in\mathbb{N}\}$ be the natural filtration of $M$, i.e., $\mathcal{F}_{n}=\sigma\{M_{1},M_{2},\ldots,M_{n}\}$; note that we also have $\mathcal{F}_{n}=\sigma\{X_{1},X_{2},\ldots,X_{n}\}.$ We now show that $M$ is an $L^{1}$-bounded $\mathbb{F}$-martingale. Each $M_{n}$ is integrable (recall that a product of independent integrable random variables is integrable). Moreover, since $X_{n+1}$ is independent of $\mathcal{F}_{n}$ while $M_{n}$ is $\mathcal{F}_{n}$-measurable, we have that \begin{eqnarray*} & & E\left[M_{n+1}\mid\mathcal{F}_{n}\right]\\ & = & E\left[\frac{X_{n+1}}{q}M_{n}\mid\mathcal{F}_{n}\right]\\ & = & M_{n}E\left[\frac{X_{n+1}}{q}\mid\mathcal{F}_{n}\right]\\ & = & M_{n}E\left[\frac{X_{n+1}}{q}\right]\\ & = & M_{n}. \end{eqnarray*} Finally, since $M_{n}>0$, we have $E\left[|M_{n}|\right]=E\left[M_{n}\right]=E[M_{1}]=E[X_{1}/q]=1$. This shows that $\sup_{n}E\left[|M_{n}|\right]<\infty.$

By the Martingale Convergence Theorem, there exists an $\mathcal{F}_{\infty}$-measurable random variable $\xi$ with $E[|\xi|]<\infty$ such that $M_{n}\rightarrow\xi$ pointwise a.e. (Note that, in general, we do not have $M_{n}\rightarrow\xi$ in $L^{1}$.)
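
To see concretely why $L^{1}$-convergence can fail, here is a small Monte Carlo sketch (the uniform distribution below is my own illustrative choice, not part of the problem): with $X_n$ uniform on $(0,2)$ we have $q=1$, so $M_n=\prod_{k\leq n}X_k$ has mean exactly $1$ for every $n$, yet almost every path collapses to $0$:

```python
import numpy as np

rng = np.random.default_rng(1)
paths, n = 50_000, 200
# X_k uniform on (0, 2): q = E[X_k] = 1, so M_n = prod_{k<=n} X_k
x = rng.uniform(0.0, 2.0, size=(paths, n))
m = np.cumprod(x, axis=1)

# E[M_n] = 1 exactly for every n (by independence), yet a typical path
# dies out: the unit mean is carried by ever rarer, ever larger spikes.
# This is how M_n -> 0 a.s. coexists with E[M_n] = 1, and why the
# convergence cannot be in L^1.
print("median of M_200:", np.median(m[:, -1]))   # essentially 0
print("sample mean of M_200:", m[:, -1].mean())  # poor estimate of the true mean 1
```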

Case 1: $q<1$. We have that $\prod_{k=1}^{n}X_{k}=q^{n}M_{n}\rightarrow0$ a.e., because $q^{n}\rightarrow0$ while $M_{n}\rightarrow\xi$ a.e.

Case 2: $q=1$. Define $a=E\left[\sqrt{X_{n}}\right].$ By the Cauchy-Schwarz inequality, we have that \begin{eqnarray*} a & = & \int\sqrt{X_{n}}\cdot1\,dP\\ & \leq & \left\{ \int X_{n}\,dP\right\} ^{\frac{1}{2}}\left\{ \int1^{2}\,dP\right\} ^{\frac{1}{2}}\\ & = & 1, \end{eqnarray*} and equality holds iff $\sqrt{X_{n}}$ and $1$ are linearly dependent as elements of $L^{2}$, which is false since $X_{n}$ is not a.s. constant. Therefore, $0<a<1.$
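
For instance (an example of my own choosing, not part of the problem), if $X_{n}$ is uniform on $(0,2)$, so that $q=1$, then $$a = E\left[\sqrt{X_{n}}\right] = \frac{1}{2}\int_{0}^{2}\sqrt{x}\,dx = \frac{1}{2}\cdot\frac{2}{3}\,2^{3/2} = \frac{2\sqrt{2}}{3} \approx 0.943 < 1.$$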

Define the process $Y=\{Y_{n}\mid n\in\mathbb{N}\}$ by $Y_{n}=\frac{\sqrt{M_{n}}}{a^{n}}.$ One shows as before that $Y$ is an $\mathbb{F}$-martingale: by the Cauchy-Schwarz inequality, the integrability of $M_{n}$ implies that of $\sqrt{M_{n}}$; clearly $Y_{n}$ is $\mathcal{F}_{n}$-measurable; and \begin{eqnarray*} & & E\left[Y_{n+1}\mid\mathcal{F}_{n}\right]\\ & = & E\left[\frac{\sqrt{X_{n+1}}}{a}\cdot Y_{n}\mid\mathcal{F}_{n}\right]\\ & = & Y_{n}E\left[\frac{\sqrt{X_{n+1}}}{a}\mid\mathcal{F}_{n}\right]\\ & = & Y_{n}E\left[\frac{\sqrt{X_{n+1}}}{a}\right]\\ & = & Y_{n}. \end{eqnarray*} Since $Y$ is non-negative, we have $E\left[|Y_{n}|\right]=E\left[Y_{n}\right]=E\left[Y_{1}\right].$ It follows that $\sup_{n}E\left[|Y_{n}|\right]<\infty$, i.e., $Y$ is $L^{1}$-bounded. By the Martingale Convergence Theorem again, there exists an integrable random variable $\eta$ such that $Y_{n}\rightarrow\eta$ a.e. Recall that $M_{n}\rightarrow\xi$ a.e. and notice that $M_{n}=a^{2n}Y_{n}^{2}$. Letting $n\rightarrow\infty$, we get $\xi=0$ a.e., because $a^{2n}\rightarrow0$ while $Y_{n}^{2}\rightarrow\eta^{2}<\infty$ a.e. Since $q=1$ gives $M_{n}=\prod_{k=1}^{n}X_{k}$, this is exactly $\prod_{k=1}^{n}X_{k}\rightarrow0$ a.e.

Answer 2.

Apologies for my first hasty attempt; perhaps this will be more satisfactory. As you noted, the problem is equivalent to showing that $\sum_i \log(X_i) = -\infty$ a.s.

Firstly, since the logarithm is concave and not affine and $X_i$ is not a.s. constant, one has by Jensen's inequality that

$$0 = \log(1) \geq \log q = \log(E[X_i]) > E[\log(X_i)] =: \mu.$$

See, e.g., Convexity and equality in Jensen inequality for when Jensen's inequality can be an equality. In our case the inequality must be strict, which is what we used in the previous line.
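
As a concrete illustration (again an example of my own choosing): if $X_i$ is uniform on $(0,2)$, then $q = 1$, while $$\mu = E[\log X_i] = \frac{1}{2}\int_0^2 \log x\,dx = \log 2 - 1 \approx -0.307 < 0 = \log q,$$ so the Jensen gap is indeed strict.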

Thus it suffices to show (set $Y_i := -\log(X_i)$) the following:

Theorem. If $Y_i$ are i.i.d. with $\mu := E[Y_i] > 0$ (possibly $\mu = \infty$), then $\sum_i Y_i = \infty$ a.s.

Lemma. For any real sequence $(y_i)_{i \geq 1}$, if $\frac{1}{n} \sum_{i=1}^n y_i \to \mu > 0$ then $\sum_i y_i = \infty$.

Proof of lemma. Since $\frac{1}{n}\sum_{i=1}^{n} y_i \to \mu > 0$ while $n \to \infty$, $$\sum_{i=1}^{n} y_i = \left(\frac{1}{n}\sum_{i=1}^{n} y_i\right) n \to \mu \times \infty = \infty.$$ In particular $\liminf_n \sum_{i=1}^{n} y_i = \infty$, so $\sum_i y_i = \infty$. QED.

Proof of theorem. If $\mu < \infty$, then by the strong law of large numbers $\frac{1}{n}\sum_{i=1}^n Y_i \to \mu > 0$ a.s. By the lemma, $\sum_i Y_i = \infty$ a.s. in this case.

Now consider the case $\mu = \infty$. Write $Y_i = Y_i^+ - Y_i^-$ with $Y_i^+ = \max(Y_i, 0)$ and $Y_i^- = \max(-Y_i, 0)$, the usual decomposition of $Y_i$ into its positive and negative parts. Since $E[Y_i]$ is well defined (in our setting $E[Y_i^-] = E[(\log X_i)^+] \leq E[X_i] \leq 1 < \infty$, because $\log x \leq x$), it must be that $E[Y_i^-] \in [0,\infty)$ and $E[Y_i^+] = \infty$. For $N > 0$ we have $\min(Y_i, N) = \min(Y_i^+, N) - Y_i^-$, hence $$ E[\min(Y_i, N)] = E[\min(Y_i^+, N)] - E[Y_i^-] \to \infty $$ by monotone convergence of $\min(Y_i^+, N) \uparrow Y_i^+$ as $N \to \infty$. In particular we may choose $N$ large enough that $0 < E[\min(Y_i, N)] \leq N$. The random variables $(\min(Y_i, N))_i$ are i.i.d. with mean in $(0,\infty)$, so by what we showed earlier $\sum_i \min(Y_i, N) = \infty$ a.s. Since $\min(Y_i, N) \leq Y_i$, it follows that

$$ \infty = \sum_i \min(Y_i, N) \leq \sum_i Y_i $$ a.s., QED.
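
To make the truncation step concrete (an example of my own choosing): if $Y_i$ has density $y^{-2}$ on $[1,\infty)$, then $E[Y_i] = \infty$, while for $N \geq 1$ $$E[\min(Y_i, N)] = \int_1^N y \cdot y^{-2}\,dy + N\,P(Y_i > N) = \log N + 1 \to \infty,$$ so each truncation has finite positive mean, exactly as the proof requires.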