Bounding the size of a typical random geometric series


Let $(a_i)_i$ be an iid sequence of positive reals. I am specifically imagining $a_0$ as supported on a finite set, but the same analysis should hold in general (assuming, say, $E|\log a_0| < \infty$). Consider the following series: $$ b_n = \sum_{i=0}^{n-1} a_{n-1}\cdot a_{n-2} \cdots a_{i+1} $$ (the $i = n-1$ term is the empty product, equal to $1$).

I am interested in bounding the size of $b_n$ for large $n$ in a typical case: in particular, whether and with what probability $(b_n)_n$ is bounded, and by what constant. Here are some preliminary thoughts:

Let $\alpha_n = a_0 a_1 \cdots a_{n-1}$. As the product of iid positive reals, $\log \alpha_n$ is a sum of $n$ iid terms, and therefore $$ \frac{1}{n}\log \alpha_n \to E(\log a_0) $$

as $n \to \infty$ with probability $1$ by the strong law of large numbers. Let $g = \exp(E(\log a_0))$ be the "geometric expectation" of $a_0$, in which case the above observation implies that $$ \alpha_n \sim g^n $$ in a typical case (more precisely, $\alpha_n^{1/n} \to g$ almost surely). The behavior of the random series $b_n$ is therefore controlled by the size of $g$, and in particular we expect $\sup_n b_n < \infty$ to hold with probability $1$ when $g < 1$.
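As a quick sanity check of this LLN asymptotic, here is a small simulation. The two-point distribution for $a_0$ is my own illustrative choice (note it has $P(a_0 > 1) > 0$, which is the interesting case):

```python
import math
import random

random.seed(0)

# Illustrative two-point distribution (my choice, not from the question):
# a_0 = 1.2 or 0.5 with probability 1/2 each. Then P(a_0 > 1) > 0, yet
# E(log a_0) = (log 1.2 + log 0.5)/2 < 0, so g = exp(E(log a_0)) < 1.
values = [1.2, 0.5]
g = math.exp(sum(math.log(v) for v in values) / len(values))

n = 200_000
log_alpha_n = sum(math.log(random.choice(values)) for _ in range(n))

# LLN: (1/n) * log(alpha_n) should be close to log(g) = E(log a_0).
print(log_alpha_n / n, math.log(g))
```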

Multiplying and dividing the $i$-th term by $\alpha_{i+1} = a_0 \cdots a_i$, we may rewrite the definition of $b_n$ as

$$ b_n = \sum_{i=0}^{n-1} \frac{a_{n-1}a_{n-2}\cdots a_{i+1}\cdot{\color{blue}{a_i\cdots a_0}}}{{\color{blue}{a_i\cdots a_0}}} = \sum_{i=0}^{n-1}\frac{\alpha_n}{\alpha_{i+1}} = \alpha_n \sum_{i=0}^{n-1}\frac{1}{\alpha_{i+1}} $$

By the law of large numbers we expect $\alpha_n \sim g^n$ for large $n$. Assuming an idealized average case where $\alpha_n = g^n$ holds exactly for every $n$, we would have the following.

$$ b_n = g^n \sum_{i=0}^{n-1}\frac{1}{g^{i+1}} = \frac{g^n}{g}\sum_{i=0}^{n-1}\Big(\frac{1}{g}\Big)^i = \frac{g^n}{g}\cdot\frac{g^{-n}-1}{g^{-1}-1} = \frac{1-g^n}{1-g} $$

We therefore heuristically expect $b_n \lesssim \frac{1}{1-g}$ (when $g < 1$) in a typical case. I would now like to make this heuristic precise, but this is where I am at a loss.
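As a numerical probe of this heuristic (not a proof), note that the definition gives the recursion $b_{n+1} = a_n b_n + 1$ with $b_0 = 0$, which makes $b_n$ cheap to simulate. The two-point distribution below is again my own illustrative choice:

```python
import math
import random
import statistics

random.seed(1)

# Illustrative distribution (my choice): a_0 in {1.2, 0.5}, equal weights,
# so P(a_0 > 1) > 0 while g = exp(E(log a_0)) = sqrt(0.6) < 1.
values = [1.2, 0.5]
g = math.exp(sum(math.log(v) for v in values) / len(values))
b_heuristic = 1 / (1 - g)   # the heuristic bound 1/(1-g), roughly 4.44

def sample_b(n):
    """Simulate b_n via the recursion b_{k+1} = a_k * b_k + 1, b_0 = 0."""
    b = 0.0
    for _ in range(n):
        b = random.choice(values) * b + 1
    return b

samples = [sample_b(500) for _ in range(5000)]
print(b_heuristic, statistics.median(samples), max(samples))
```

In runs like this the empirical median of $b_n$ lands within a small factor of $1/(1-g)$, while the sample maximum is far larger, consistent with the heavy-tailed fluctuations the question is asking about.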

Let $b = \frac{1}{1-g}$. I want to know by how much and with what probability $b_n$ will stray above this typical value of $b$. Does it hold, for instance, that $$ P(b_n < b + \varepsilon \text{ for all suff. large $n$}) = 1 $$ for all $\varepsilon > 0$? And if not, what can be said about the magnitude of the difference $b_n - b$ (or the ratio $b_n/b$) for large $n$? What limit theorems can be applied here?


Sidenotes:

Whether $(b_n)_n$ is bounded is trivial if $P(a_0 < 1-\varepsilon) = 1$ for some $\varepsilon > 0$, so we are specifically imagining the case $P(a_0 > 1) > 0$. Moreover, you will notice that the sum defining $b_n$ is taken over a peculiar ordering of the indices; it would perhaps be more natural to consider a series like $$ \tilde{b}_n = \sum_{i=0}^{n-1} a_0 a_1 \cdots a_i $$ This series is also easier to analyze. In particular, $\tilde{b}_n$ is monotone increasing in $n$ and, in the case that $P(a_0 < 1 -\varepsilon) = 1$, clearly converges almost surely to a finite limit (at most $(1-\varepsilon)/\varepsilon \le 1/\varepsilon$).
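A quick illustration of this sidenote's bound; the specific value $\varepsilon = 0.25$ and the two-point support are my own hypothetical choices:

```python
import random

random.seed(3)

# Case P(a_0 < 1 - eps) = 1: take eps = 0.25 and a_0 in {0.7, 0.5},
# both below 1 - eps = 0.75 (illustrative choice). Then
# tilde_b_n = sum_{i<n} a_0*...*a_i is increasing and bounded by
# sum_{i>=1} (1-eps)^i = (1-eps)/eps <= 1/eps.
eps = 0.25
values = [0.7, 0.5]

tilde_b = 0.0
prod = 1.0
for _ in range(200):
    prod *= random.choice(values)
    tilde_b += prod

print(tilde_b, (1 - eps) / eps, 1 / eps)
```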

The order of the indices in the definition of $b_n$ comes from the context in which it arose: compositions of random affine maps. If $T_i : \mathbb{R} \to \mathbb{R}$ is a random affine map of the form $T_i(x) = a_i x + 1$, then (a quick calculation shows) the composite map $$ S_n = T_{n-1} \circ T_{n-2} \circ \cdots \circ T_0 $$

is precisely the affine map $S_n(x) = \alpha_n x + b_n$. The asymptotic behavior of $b_n$ is thus essential for understanding the asymptotic behavior of $S_n$.
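For concreteness, this affine-composition identity is easy to verify numerically (the two-point values for the $a_i$ are again just an illustrative choice):

```python
import math
import random

random.seed(2)

# Numerical check of the identity S_n(x) = alpha_n * x + b_n, where
# S_n = T_{n-1} o ... o T_0 and T_i(x) = a_i * x + 1.
n = 10
a = [random.choice([1.2, 0.5]) for _ in range(n)]

x = 3.0
y = x
for a_i in a:            # apply T_0 first, then T_1, ..., T_{n-1}
    y = a_i * y + 1      # after the loop, y = S_n(x)

alpha_n = math.prod(a)                               # a_0 * ... * a_{n-1}
b_n = sum(math.prod(a[i + 1:]) for i in range(n))    # a_{n-1} * ... * a_{i+1}

print(y, alpha_n * x + b_n)   # the two values agree up to rounding
```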

This also implies that, rather than monotonically increasing with $n$ as $\tilde{b}_n$ does, $b_n$ behaves more like a random walk on $\mathbb{R}$ with a bias to drift downward.