Infinite oscillation of random signs


Suppose that $\left(a_n\right)$ is a sequence of real numbers and that $\left(\varepsilon_n\right)$ is a sequence of IID RVs with $$P\left(\varepsilon_n = \pm 1\right) = \frac{1}{2}$$

According to Williams (Probability with Martingales (1991), Section 12.3 "Random signs", pp. 113-114), the results below show that

i. $\sum \varepsilon_n a_n$ converges (a.s.) if and only if $\sum a_n^2 < \infty$, and that

ii. $\sum \varepsilon_n a_n$ a.s. oscillates infinitely if $\sum a_n^2 = \infty$, i.e. the partial sums $S_m := \sum_{k \leq m} \varepsilon_k a_k$ satisfy $\limsup_m S_m = +\infty$ and $\liminf_m S_m = -\infty$ a.s.

I understand why (i) follows from the results below, but I don't see how to prove (ii).


Theorem 12.2 (Sums of zero-mean independent variables in $\mathcal{L}^2$, pp. 112-113). Suppose that $\left(X_k : k \in \mathbb{N}\right)$ is a sequence of independent random variables such that, for every $k$, $$E\left(X_k\right) = 0,\ \sigma_k^2 := \mathrm{Var}\left(X_k\right) < \infty$$

a. Then $$\left(\sum \sigma_k^2 < \infty\right)\ \textrm{implies that }\left(\sum X_k\ \textrm{converges a.s.}\right)$$

b. If the variables $\left(X_k\right)$ are bounded by some constant $K$ in $\left[0,\infty\right)$ in that $\left|X_k\left(\omega\right)\right| \leq K$, $\forall k, \forall \omega$, then

$$\left(\sum X_k\ \textrm{converges a.s.}\right)\ \textrm{implies that }\left(\sum \sigma_k^2 < \infty\right)$$

Notes to the theorem

  1. The Kolmogorov $0$-$1$ law implies that $$P\left(\sum X_k\ \mathrm{converges}\right) = 0\ \mathrm{or}\ 1$$

  2. The proof of part b given in the book shows in fact that if $\left(X_k\right)$ is a sequence of independent zero-mean RVs uniformly bounded by some constant $K$, then $$\left(P\left\{\textrm{partial sums of }\sum X_k\ \textrm{are bounded}\right\} > 0\right) \implies \left(\sum X_k\ \textrm{converges a.s.}\right)$$

Accepted answer:

The events

$$A := \left\{\sum \varepsilon_k a_k\ \textrm{diverges to }+\infty\right\}$$

and

$$B := \left\{\sum \varepsilon_k a_k\ \textrm{diverges to }-\infty\right\}$$

are both tail events for the independent sequence $\left(\varepsilon_k\right)$, hence by Kolmogorov's $0$-$1$ law $P\left(A\right),P\left(B\right)\in\left\{0,1\right\}$. By symmetry ($\left(\varepsilon_n\right)$ and $\left(-\varepsilon_n\right)$ have the same law) $P\left(A\right)=P\left(B\right)$, and since $A \cap B = \emptyset$ the two probabilities cannot both equal $1$, so $P\left(A\right) = P\left(B\right) = 0$. Now if $\sum a_k^2 = \infty$, the series a.s. fails to converge (by (i)), and Note 2 shows the partial sums are a.s. unbounded; since divergence to $+\infty$ or to $-\infty$ has probability $0$, the partial sums a.s. oscillate infinitely.

Second answer:

Since $P\left(\varepsilon_n = \pm 1\right) = \frac{1}{2}$, we have $\operatorname{Var}\left(\varepsilon_n\right) = 1$, so

$$\sum \operatorname{Var}\left(\varepsilon_k a_k\right) = \sum a_k^2 \operatorname{Var}\left(\varepsilon_k\right) = \sum a_k^2,$$

and Theorem 12.2 applies with $X_k = \varepsilon_k a_k$.
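As a numerical sanity check (not part of Williams's argument), here is a small simulation contrasting the two regimes; the step count $N$ and the choices $a_k = 1/k$ (so $\sum a_k^2 < \infty$) versus $a_k = 1/\sqrt{k}$ (so $\sum a_k^2 = \infty$) are illustrative assumptions:

```python
import random

random.seed(0)

def partial_sums(a, n):
    """Partial sums S_m = sum_{k<=m} eps_k * a(k) along one sample path."""
    s, path = 0.0, []
    for k in range(1, n + 1):
        s += random.choice((-1.0, 1.0)) * a(k)  # eps_k = +/-1 with prob 1/2
        path.append(s)
    return path

N = 200_000

# Case (i): a_k = 1/k, so sum a_k^2 < infinity and the series converges a.s.;
# the second half of the path stays in a tiny band.
conv = partial_sums(lambda k: 1.0 / k, N)
tail = conv[N // 2:]
print("a_k = 1/k       tail spread:", max(tail) - min(tail))

# Case (ii): a_k = 1/sqrt(k), so sum a_k^2 = infinity; the partial sums
# behave like Brownian motion run at time ~log(m) and keep swinging.
osc = partial_sums(lambda k: k ** -0.5, N)
print("a_k = 1/sqrt(k) path spread:", max(osc) - min(osc))
```

With overwhelming probability the first spread is tiny (the tail variance is $\sum_{k > N/2} 1/k^2 \approx 2/N$), while the second grows like $\sqrt{\log N}$, consistent with a.s. convergence in case (i) and infinite oscillation in case (ii).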