Suppose we pick a sequence of positive integers $(a_k)=(a_0,a_1,a_2,a_3,\ldots)$ whose terms are sampled i.i.d. from some distribution on $\mathbb{N}^+$. If we consider the corresponding generating function $f(x) = \sum_k a_k x^k$, what can we say about the radius of convergence $R$ of $f$? The Cauchy-Hadamard theorem says $R^{-1}= \limsup_{k\to\infty} \sqrt[k]{|a_k|}$, but I'm wondering if we can say any more from a probabilistic standpoint.
Here are my thoughts (mostly low-hanging fruit) on the problem if the $a_k$ are positive integers.
- If $(a_k)$ is bounded, we have $R=1$; this follows immediately by comparison to the geometric series. I don't think "most" positive integer sequences are bounded; in fact, I suspect the bounded sequences form a set of measure zero in the space of all such sequences.
- If $a_k = O(k^r)$ for some real $r$, then $R=1$ as well, since polynomial factors don't affect the root test and $a_k\geq 1$ forces $R\leq 1$. Likewise, if $a_k = O(M^k)$, then $R\geq M^{-1}$, with equality when $\limsup_k \sqrt[k]{a_k} = M$; by the integer stipulation, only $M\geq 1$ can occur.
- If the $a_k$ are positive integers, I don't think we can do better than $R=1$: since $a_k\geq 1$ for all $k$, we have $\limsup_k \sqrt[k]{a_k}\geq 1$ and hence $R\leq 1$.
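As a numerical sanity check of the $R=1$ heuristic, here's a small simulation. The choice of a geometric distribution on $\mathbb{N}^+$ is mine, not implied by the question; any distribution with light enough tails should behave similarly:

```python
import random

random.seed(0)

def geometric(p=0.5):
    """Sample from the geometric distribution on {1, 2, 3, ...}."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

# Draw the i.i.d. sequence and look at the root-test quantities a_k^(1/k)
# for large k; Cauchy-Hadamard says 1/R is their limsup.
N = 2000
a = [geometric() for _ in range(N)]
tail_max = max(a[k] ** (1.0 / k) for k in range(N // 2, N))
print(tail_max)  # barely above 1, consistent with R = 1
```

Since $a_k\geq 1$, the estimate can never drop below $1$; the simulation just illustrates that it doesn't exceed $1$ by much either.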
I thought about how to generalize the problem if we allow the $a_k$ to be real numbers; I haven't thought about the complex case. Here are my thoughts, again somewhat rudimentary.
- If the $a_k$ are eventually zero, obviously $R=\infty$.
- We can now have $a_k = O(M^k)$ for any $M>0$, so radii other than $1$ occur (for instance, the Maclaurin series for tangent gives $R=\pi/2$).
- Analyzing convergence on the boundary $|x|=R$ is probably a lost cause.
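To see the tangent example numerically, we can generate the Maclaurin coefficients of $\tan x$ by formal long division of the $\sin$ series by the $\cos$ series (exact rational arithmetic; the division recurrence below is standard power-series division, not something from the question) and then apply the root test:

```python
from fractions import Fraction
from math import factorial, pi

N = 40  # number of coefficients to compute

# Maclaurin coefficients of sin and cos as exact rationals.
sin_c = [Fraction((-1) ** (k // 2), factorial(k)) if k % 2 else Fraction(0)
         for k in range(N)]
cos_c = [Fraction((-1) ** (k // 2), factorial(k)) if k % 2 == 0 else Fraction(0)
         for k in range(N)]

# Long division tan = sin / cos, using cos_c[0] = 1:
# sin_c[n] = sum_{k=0}^{n} cos_c[k] * tan_c[n-k], solved for tan_c[n].
tan_c = []
for n in range(N):
    tan_c.append(sin_c[n] - sum(cos_c[k] * tan_c[n - k] for k in range(1, n + 1)))

# Cauchy-Hadamard estimate R ~ |a_n|^(-1/n) at the last (odd) index.
n = N - 1
est_R = float(abs(tan_c[n])) ** (-1.0 / n)
print(est_R)  # approaches pi/2 = 1.5707... as N grows
```

The computed coefficients start $x + x^3/3 + 2x^5/15 + \cdots$, and the root-test estimate lands close to $\pi/2$ already at degree $39$.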
Please feel free to ask for clarification or to change the tags if you think they could be improved.
Update: instead of, "pick a sequence of positive integers independently and identically distributed from $\mathbb{N}^+$," perhaps I should specify a distribution. After looking at several common models, I think a Boltzmann or logarithmic distribution might be best, but I'm not sure. I realize this is an important aspect of the problem and I'm sorry I don't have a better idea of what to ask.
Let $\mu:=E[a_0]$ and fix $x>0$. Define $Y_i:=a_ix^i$, so that $\mu_i:=E[Y_i]=\mu x^i$ and $\mbox{Var}(Y_i)=x^{2i}\mbox{Var}(a_0)$. Since the $Y_i$ are independent, the Kolmogorov 2-series theorem states that $\sum_i Y_i$ converges almost surely (is finite, in fact) if $\sum_i\mu_i$ and $\sum_i \mbox{Var}(Y_i)$ both converge.
Both conditions reduce to geometric series: $\sum_i\mu_i=\mu\sum_{i\geq 0}x^i$ and $\sum_i \mbox{Var}(Y_i)=\mbox{Var}(a_0)\sum_i x^{2i}$, which converge precisely when $x<1$. So, assuming $\mu$ and $\mbox{Var}(a_0)$ are finite, $f(x)$ converges a.s. for each fixed $x<1$, giving $R\geq 1$ almost surely; combined with $R\leq 1$ from $a_k\geq 1$, we get $R=1$ almost surely.
For real-valued $a_i$ (the integer case forces $R\leq 1$, as noted above), $R>1$ might be possible, but for that you'd need the 3-series theorem and likely more subtle information about the distribution of the $a_i$. You'd also need the 3-series theorem to establish convergence if the sum of expected values or variances diverges.
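For what it's worth, here's a quick simulation of the 2-series conclusion at a fixed $x<1$, again with a geometric distribution on $\mathbb{N}^+$ standing in for the unspecified one; the partial sums of $\sum_k a_k x^k$ settle down almost immediately:

```python
import random

random.seed(1)

def geometric(p=0.5):
    """Sample from the geometric distribution on {1, 2, 3, ...}."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

# Accumulate partial sums of sum_k a_k x^k at a fixed x < 1 and record
# them at a few checkpoints to watch the a.s. convergence.
x = 0.5
partial = 0.0
checkpoints = {}
for k in range(200):
    partial += geometric() * x ** k
    if k in (9, 49, 199):
        checkpoints[k] = partial

print(checkpoints)  # the three values agree to many decimal places
```

Because the terms are bounded by $a_k x^k$ with $x=1/2$, the tail beyond $k=50$ is already far below floating-point visibility, which is the deterministic shadow of the almost-sure convergence.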