Suppose $(X_n)_{n\ge 1}$ is a sequence of random variables with $\sup_n|X_n| \leq 1$ a.s. Use Doob's decomposition to prove that $\sum_{n\geq 1} X_n$ converges a.s. iff $\sum_{n\geq 1}E[X_n\mid X_1,X_2,\dots,X_{n-1}]$ converges a.s.
When I posted this question, I received two downvotes. I assumed that was because I did not provide my thoughts on the problem. But at first sight it is really hard to relate this problem to Doob's decomposition. What I have come up with since is to define $Y_n = X_n - E[X_n \mid X_1,X_2,\dots,X_{n-1}]$ and $S_n = \sum_{k=1}^{n} Y_k$; then, since $Y_n$ has conditional mean zero given $X_1,\dots,X_{n-1}$, $S_n$ is a martingale. Now $|S_n|$ or $S_n^2$ is a submartingale, so we can use Doob's decomposition, say $S_n^2 = M_n + A_n$. I want to show that $\sup_n |S_n|$ is finite a.s. At first thought, if $E[\sup_n S_n^2]$ is finite, then $\sup_n S_n^2$ is finite a.s., hence $\sup_n |S_n|$ is finite a.s. By the martingale property, $E[M_n]$ is finite. My question now is: is it possible to show that $E[A_n]$ stays bounded? $A_n$ is nondecreasing and predictable; how do I rule out the case where its expectation goes to infinity?
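For what it is worth, the compensator in this decomposition can be computed explicitly (a standard calculation, assuming $\mathcal F_n=\sigma(X_1,\dots,X_n)$ and $S_0=A_0=0$):
$$A_n-A_{n-1}=E\bigl[S_n^2-S_{n-1}^2\mid\mathcal F_{n-1}\bigr]=E\bigl[(S_n-S_{n-1})^2\mid\mathcal F_{n-1}\bigr]=E\bigl[Y_n^2\mid\mathcal F_{n-1}\bigr],$$
so $A_n=\sum_{k=1}^n E[Y_k^2\mid\mathcal F_{k-1}]$; the middle equality uses the martingale property $E[S_{n-1}(S_n-S_{n-1})\mid\mathcal F_{n-1}]=0$.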
I think the following might work. Define $Y_n$ and $S_n$ as before; $S_n$ is a martingale and $\sup_n|S_{n+1}-S_n| \le 2$. There is a theorem stating that $P(C \cup D) = 1$, where $C$ is the event on which $\lim_n S_n$ exists and is finite and $D$ is the event on which $\limsup_n S_n = +\infty$ and $\liminf_n S_n = -\infty$. On $C$, $\sum_n X_n$ converges iff $\sum_n E[X_n \mid \mathcal{F}_{n-1}]$ converges, while on $D$ both series always diverge. Hence the conclusion follows.
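To spell out the claim on $C$: writing $T_n=\sum_{k=1}^n X_k$ and $U_n=\sum_{k=1}^n E[X_k\mid\mathcal F_{k-1}]$ (notation introduced only for this computation), the definition of $S_n$ gives
$$T_n = S_n + U_n,$$
so on $C$, where $S_n$ has a finite limit, $T_n$ converges if and only if $U_n$ does.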
No, the claim on $D$ is false. On $D$ the partial sums of $X_n$ can leave $[-M,M]$ infinitely often for every $M$ while $\sum_n E[X_n\mid\mathcal{F}_{n-1}]$ is finite.
Actually, the method described in the last paragraph might still work. The problem on $D$ is that the partial sums need not be monotone. But since $\sup_n|X_n| \le 1$, we have $X_n + 1 \ge 0$. Now if we apply the argument of the last paragraph to $Y_n = X_n + 1$ (redefining $Y_n$), then on $D$ we get $\sum_n Y_n = \sum_n E[Y_n \mid \mathcal{F}_{n-1}] = +\infty$. But the fact that $\sum_n 1 = +\infty$ makes it hard to get back to the behavior of $\sum_n X_n$ and $\sum_n E[X_n \mid \mathcal{F}_{n-1}]$.
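To make the last point concrete: with $Y_n=X_n+1$,
$$\sum_{k=1}^n Y_k=\sum_{k=1}^n X_k+n,\qquad \sum_{k=1}^n E[Y_k\mid\mathcal F_{k-1}]=\sum_{k=1}^n E[X_k\mid\mathcal F_{k-1}]+n,$$
and both tend to $+\infty$ because of the added $n$, regardless of whether the original series converge.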
The sufficiency is false. Indeed, let $X_1,X_2,\dots$ be i.i.d. uniformly distributed on $[-1,1]$. Then $E[X_n\mid X_1,X_2,\dots,X_{n-1}]=0$ in view of independence. However, the series $\sum_{n=1}^\infty X_n$ diverges, as $\limsup X_n =1$ almost surely.
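To justify the last claim: for any $\varepsilon\in(0,1)$ the events $\{X_n>1-\varepsilon\}$ are independent and
$$\sum_{n=1}^\infty P(X_n>1-\varepsilon)=\sum_{n=1}^\infty\frac{\varepsilon}{2}=\infty,$$
so by the second Borel–Cantelli lemma $X_n>1-\varepsilon$ infinitely often almost surely; in particular the terms of the series do not tend to zero, and the series diverges.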
The necessity is true. It seems that there is no simple proof (at least I don't see one). Let me, before giving a reference, present some thoughts on why it is not so easy.
The closest analogue of this statement is the necessity part of the Kolmogorov two-series theorem, which says that if a series $\sum_{n=1}^\infty X_n$ of bounded independent random variables converges, then both $\sum_{n=1}^\infty E[X_n]$ and $\sum_{n=1}^\infty V[X_n]$ converge. A standard proof runs as follows: introduce an independent copy $\{X_n'\}$ of $\{X_n\}$; then the series $\sum_{n=1}^\infty (X_n-X_n')$ of independent centered bounded random variables converges. Consequently, $\sum_{n=1}^\infty V[X_n-X_n'] = 2\sum_{n=1}^\infty V[X_n]$ converges by the Kolmogorov one-series theorem. Hence $\sum_{n=1}^\infty (X_n -E[X_n])$ converges by the same theorem, and we deduce the convergence of $\sum_{n=1}^\infty E[X_n]$.
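For reference, the form of the one-series theorem used twice above (for uniformly bounded summands it is an equivalence, which is what the first application needs): if $Z_1,Z_2,\dots$ are independent with $E[Z_n]=0$ and $\sup_n|Z_n|\le c$ for some constant $c$, then
$$\sum_{n=1}^\infty Z_n\ \text{converges a.s.}\quad\Longleftrightarrow\quad\sum_{n=1}^\infty V[Z_n]<\infty.$$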
For non-independent bounded random variables, this should generalize as follows: convergence of a series $\sum_{n=1}^\infty X_n$ of bounded random variables implies convergence of both $\sum_{n=1}^\infty Y_n$ and $\sum_{n=1}^\infty E[(X_n-Y_n)^2]$, where $Y_n = E[X_n\mid X_1,\dots,X_{n-1}]$. Trying to generalize the above argument, we need to construct $X_n'$ such that $\{X_n-X_n'\}$ is a martingale difference sequence and then use the generalization of the Kolmogorov one-series theorem to martingale differences, which does exist and is due to Burkholder. However, there seems to be no way to construct such $X_n'$. An independent copy is clearly not what we want. We could try to set $X_n' = Y_n + Z_n'$ with $Z_n'$ an independent copy of $X_n - Y_n$, but this does not yield the required martingale property either.
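For completeness, one standard formulation of such a martingale analogue (I state the version I know; whether this exact form is the one due to Burkholder is best checked in the reference below): if $(d_n)$ is a square-integrable martingale difference sequence with respect to a filtration $(\mathcal F_n)$, then
$$\sum_{n=1}^\infty d_n\ \text{converges a.s. on the event}\ \Bigl\{\sum_{n=1}^\infty E[d_n^2\mid\mathcal F_{n-1}]<\infty\Bigr\}.$$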
Therefore, another argument is needed, and one is given here, starting from page 576. Do ask if you have any questions.
Finally, it might be that the problem was not formulated as it should have been. Namely, for a non-negative bounded sequence $\{X_n,\, n\ge 1\}$, the series $\sum_{n=1}^\infty X_n$ and $\sum_{n=1}^\infty Y_n$ converge or diverge simultaneously. This is Lemma 2 of the above article; it is proved in Doob's monograph.
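A sketch of why the non-negative case is easier, using the bounded-increment dichotomy already quoted above (this is, I believe, essentially the argument behind that lemma): put
$$M_n=\sum_{k=1}^n\bigl(X_k-Y_k\bigr),\qquad 0\le X_k\le 1,\quad Y_k=E[X_k\mid\mathcal F_{k-1}]\in[0,1],$$
a martingale with increments bounded by $1$, so almost surely it either converges or oscillates between $+\infty$ and $-\infty$. On $\{\sum_k Y_k<\infty\}$ it is bounded below by $-\sum_k Y_k$, hence it converges and $\sum_{k\le n}X_k=M_n+\sum_{k\le n}Y_k$ converges too; on $\{\sum_k X_k<\infty\}$ it is bounded above by $\sum_k X_k$, and the same dichotomy gives convergence of $\sum_n Y_n$.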