True or false: if $(a_n)_{n=1}^\infty \subseteq \{ -1, 1 \}$ then almost certainly $\sum_{n=1}^\infty \frac{a_n}{n}$ converges


Suppose we choose a random sequence of signs $a_n \in \{ -1, 1 \}$, with the signs i.i.d. and uniform, i.e. distributed according to the uniform product measure on $\{ -1, 1 \}^\omega$, which corresponds to Lebesgue measure on $[0, 1]$ via binary expansion. I wonder whether it's true or false that, almost surely, $\sum_{n=1}^\infty \frac{a_n}{n}$ converges (the series is obviously either conditionally convergent or divergent in any case, since $\sum \frac{1}{n}$ diverges).

I'm leaning towards true, since a random sequence almost surely has roughly equal numbers of positive and negative entries as $n \to \infty$. On the other hand, it wouldn't shock me too much to find out the statement is false, or even that the series is almost surely divergent.

(Note this isn't a homework problem or anything, the problem just occurred to me.)
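Before any theory, one can get a feel for the question with a quick Monte Carlo sketch (not a proof, of course): sample a few random sign sequences and watch whether the partial sums $S_N = \sum_{n \le N} \frac{a_n}{n}$ settle down. The seed, truncation length, and trial count below are arbitrary choices.

```python
import random

random.seed(0)  # fixed seed for reproducibility

def partial_sums(N):
    """Partial sums S_1, ..., S_N of sum a_n/n for one random sign sequence."""
    s, out = 0.0, []
    for n in range(1, N + 1):
        s += random.choice((-1.0, 1.0)) / n
        out.append(s)
    return out

for trial in range(3):
    sums = partial_sums(100_000)
    # If the series converges, the tail fluctuation |S_N - S_{N/2}| should be small.
    print(trial, sums[-1], abs(sums[-1] - sums[len(sums) // 2 - 1]))
```

In each trial the difference between $S_{100000}$ and $S_{50000}$ comes out tiny, consistent with almost-sure convergence.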



Accepted answer:

See Kolmogorov's three-series theorem:

Theorem Let $(X_n)_{n\in\mathbb N}$ be independent random variables. The random series $\sum_{n=1}^\infty X_n$ converges almost surely in $\mathbb R$ if and only if the following conditions hold for some $A > 0$:
i. $\sum_{n=1}^\infty \mathbb P(|X_n| \ge A)$ converges.
ii. Let $Y_n := X_n \, 1_{\{|X_n| \le A\}}$; then $\sum_{n=1}^\infty \mathbb E(Y_n)$, the series of expected values of $Y_n$, converges.
iii. $\sum_{n=1}^\infty \mathrm{var}(Y_n)$ converges.

Now let's consider what happens in this example. $X_n = \frac{a_n}{n}$ where the $a_n$ are i.i.d. random variables with each value $+1,-1$ occurring with probability $1/2$.
Take $A = 2$. Then $|X_n| \ge A$ has probability $0$ for all $n$. So (i) is trivial.
Next, $Y_n = X_n$ and $\mathbb{E}(Y_n) = 0$, so (ii) is trivial.
Finally, $\mathrm{var}(Y_n) = 1/n^2$, and $\sum\frac{1}{n^2}$ converges. So (iii) is true.

Therefore, the series $\sum X_n$ converges a.s.

[In fact, this precise example is used to illustrate the three series theorem on that Wikipedia page.]
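Condition (iii) is the only one with real content here: the series of variances is $\sum_{n=1}^\infty \frac{1}{n^2} = \frac{\pi^2}{6}$, which is easy to confirm numerically (the truncation point below is an arbitrary choice):

```python
import math

# Partial sum of the variance series sum var(Y_n) = sum 1/n^2;
# the tail beyond N is bounded by 1/N, so N = 200_000 gives ~5e-6 accuracy.
total = sum(1.0 / n**2 for n in range(1, 200_001))
print(total, math.pi**2 / 6)
```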

Another answer:

With the partial sums $\displaystyle s_n=\sum^n_{k=1}a_k$ and $s_0=0$, we have $$\sum^n_{k=1}\frac1k(s_k-s_{k-1})=\sum^n_{k=1}\frac{s_k}k-\sum^{n-1}_{k=1}\frac{s_k}{k+1}=\sum^n_{k=1}\frac{s_k}{k(k+1)}+\frac{s_n}{n+1}.$$ The law of the iterated logarithm says $$\limsup_{n\to\infty}\frac{s_n}{\sqrt{2n\ln\ln n}}=1\quad\mbox{a.s.},$$ so the first sum on the right-hand side converges absolutely a.s. as $n\to\infty$, and the second term tends to $0$ a.s. by the strong law of large numbers. That's because, with the measure described in the question, the $a_k$ are i.i.d. random variables with mean $0$ and variance $1$.
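The summation-by-parts identity above is purely algebraic, so it can be spot-checked exactly for a single sign sequence (a sketch; the seed and length are arbitrary choices):

```python
import random

random.seed(1)
n = 1000
a = [random.choice((-1, 1)) for _ in range(n)]   # a_1, ..., a_n
s = [0]                                          # s_0 = 0
for ak in a:
    s.append(s[-1] + ak)                         # s_k = a_1 + ... + a_k

# Left-hand side: sum_{k=1}^n a_k / k
lhs = sum(a[k - 1] / k for k in range(1, n + 1))
# Right-hand side: sum_{k=1}^n s_k / (k(k+1)) + s_n / (n+1)
rhs = sum(s[k] / (k * (k + 1)) for k in range(1, n + 1)) + s[n] / (n + 1)
print(abs(lhs - rhs))   # agrees up to floating-point error
```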

Another answer:

Just for fun let's see how this follows from a bit of basic real analysis, with no fancy probability theory needed.

The Rademacher functions give a concrete realization of a sequence of iid random variables uniformly distributed on $\{-1,1\}$. (The definition at that link is concise but a little slick; to understand what's going on you should check that $$r_1=\chi_{(0,1/2)}-\chi_{(1/2,1)},$$ $$r_2=\chi_{(0,1/4)} -\chi_{(1/4,1/2)}+\chi_{(1/2,3/4)}-\chi_{(3/4,1)},$$ etc, almost everywhere.)
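These indicator identities are easy to spot-check numerically. One standard realization (an assumption here, since the linked definition isn't reproduced) is $r_n(t) = \operatorname{sign}(\sin(2^n \pi t))$, defined almost everywhere on $(0,1)$:

```python
import math

def r(n, t):
    # Rademacher function via r_n(t) = sign(sin(2^n * pi * t)), an a.e. definition
    return 1 if math.sin(2**n * math.pi * t) > 0 else -1

# Spot-check against the indicator formulas: r_1 is +1 on (0,1/2), -1 on (1/2,1);
# r_2 alternates +1, -1, +1, -1 on the four quarter-intervals.
print(r(1, 0.3), r(1, 0.7))                       # 1 -1
print(r(2, 0.1), r(2, 0.3), r(2, 0.6), r(2, 0.8)) # 1 -1 1 -1
```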

So the assertion is that the sum $\sum\frac1n r_n(t)$ converges for almost every $t\in(0,1)$. Since the $r_n$ are orthonormal it's clear that the series converges in $L^2$, but that doesn't a priori imply convergence almost everywhere. (For example, if $\sum|a_n|^2<\infty$ then $\sum a_ne^{int}$ converges almost everywhere, but this is a monstrously difficult theorem of Carleson.)

Let $$f(t)=\sum\frac1n r_n(t),$$converging in $L^2$.

Now, for each $t$, define a sequence of dyadic intervals $I_n(t)$ by requiring $t\in I_n(t)$ and $$I_n(t)=[j2^{-n},(j+1)2^{-n})\quad\text{for some }j\in\Bbb Z.$$

The Lebesgue differentiation theorem says that $$f(t)=\lim_{n\to\infty}\frac1{|I_n(t)|}\int_{I_n(t)}f$$almost everywhere.

But the cool thing about the Rademacher functions here is that $\frac1{|I_n(t)|}\int_{I_n(t)}f$ is exactly the $n$-th partial sum $\sum_{k=1}^n\frac1k r_k(t)$: for $k\le n$, $r_k$ is constant on $I_n(t)$, while for $k>n$, $r_k$ integrates to $0$ over $I_n(t)$. So the differentiation theorem says precisely that the partial sums converge to $f(t)$ for almost every $t$.
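This averaging fact can be verified numerically. Below is a sketch using the $\operatorname{sign}(\sin)$ realization of the Rademacher functions (an assumption, as above), with $f$ truncated at level $M$: since $r_k$ with $n < k \le M$ integrates to $0$ over $I_n(t)$ exactly, the dyadic average of the truncated $f$ already equals the $n$-th partial sum. The values of $n$, $M$, and $t$ are arbitrary choices.

```python
import math

def r(k, t):
    # Rademacher function via sign(sin(2^k * pi * t)), an a.e. definition
    return 1 if math.sin(2**k * math.pi * t) > 0 else -1

n, M, t = 3, 8, 0.3
# Dyadic interval I_n(t) = [j 2^-n, (j+1) 2^-n) containing t
j = int(t * 2**n)
lo = j / 2**n

# Exact average of f_M = sum_{k<=M} r_k/k over I_n(t): sample at midpoints
# of the 2^-M grid, on whose cells every r_k with k <= M is constant.
pts = [lo + (i + 0.5) / 2**M for i in range(2**(M - n))]
avg = sum(sum(r(k, p) / k for k in range(1, M + 1)) for p in pts) / len(pts)

partial = sum(r(k, t) / k for k in range(1, n + 1))
print(avg, partial)   # the two agree up to floating-point error
```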