If we pick a sequence of numbers $(a_k)$ at random, what is the expected radius of convergence of $\sum_k a_k x^k$?


Suppose we pick a sequence of positive integers, drawn independently and identically from some distribution on $\mathbb{N}^+$: call it $(a_k)=(a_0,a_1,a_2,a_3,\ldots)$. If we consider the corresponding generating function $f(x) = \sum_k a_k x^k$, what can we say about the radius of convergence $R$ of $f$? The Cauchy–Hadamard theorem says $R^{-1}= \limsup_{k\to\infty} \sqrt[k]{|a_k|}$, but I'm wondering if we can say any more from a probabilistic standpoint.
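As a quick numerical companion to the Cauchy–Hadamard formula, here is a small sketch (the helper `estimate_radius` and its window parameters are my own illustrative choices): it approximates the $\limsup$ by a maximum over a tail window of indices, which is exact for cleanly geometric coefficient sequences.

```python
# Numerical sanity check of Cauchy-Hadamard: R^{-1} = limsup_k |a_k|^{1/k}.
# We approximate the limsup by a max over a tail window of indices.

def estimate_radius(a, k_min=50, k_max=200):
    """Estimate the radius of convergence from a coefficient function a(k)."""
    root_sup = max(a(k) ** (1.0 / k) for k in range(k_min, k_max))
    return 1.0 / root_sup

# a_k = 2^k should give R = 1/2, since (2^k)^(1/k) = 2 for every k.
print(estimate_radius(lambda k: 2.0 ** k))  # ≈ 0.5
```

This is only a heuristic: a finite window cannot distinguish a true $\limsup$ from transient behavior, but it matches the exact answer whenever $a_k^{1/k}$ settles quickly.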

Here are my thoughts (mostly low-hanging fruit) on the problem if the $a_k$ are positive integers.

  • If $(a_k)$ is bounded, we have $R=1$; this follows immediately by comparison to the geometric series. I don't think "most" positive integer sequences are bounded; in fact, I suspect they are of measure zero in the set of all such sequences.
  • If $a_k = O(k^r)$ for some real $r$, then $R=1$ as well. Likewise, if $\limsup_{k\to\infty} \sqrt[k]{a_k} = M$ (roughly, if $a_k$ grows like $M^k$), then $R=M^{-1}$; by the integer stipulation, we must have $M\geq 1$. (Note that $a_k = O(M^k)$ alone only gives $R \geq M^{-1}$.)
  • Since each $a_k \geq 1$, comparison with the geometric series also gives $R \leq 1$, so for positive integer coefficients we can never do better than $R=1$.
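The bounded case can be illustrated with a small simulation (a sketch; the uniform distribution on $\{1,\dots,10\}$ is just an arbitrary bounded choice): since $1 \leq a_k^{1/k} \leq 10^{1/k} \to 1$, the estimated radius is squeezed toward $1$.

```python
import random

random.seed(0)

# Bounded i.i.d. coefficients a_k in {1, ..., 10}: then
# 1 <= a_k^(1/k) <= 10^(1/k) -> 1, which forces R = 1.
a = [random.randint(1, 10) for _ in range(2001)]

root_sup = max(a[k] ** (1.0 / k) for k in range(1000, 2001))
est = 1.0 / root_sup
print("estimated R:", est)  # squeezed into [10^(-1/1000), 1]
```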

I thought about how to generalize the problem if we allow the $a_k$ to be real numbers; I haven't thought about the complex case. Here are my thoughts, again somewhat rudimentary.

  • If the $a_k$ are eventually zero, obviously $R=\infty$.
  • We can now have $a_k = O(M^k)$ for any $M>0$; for instance, the Maclaurin series for tangent has $\limsup_k \sqrt[k]{|a_k|} = 2/\pi$, giving $R=\pi/2$.
  • Analyzing convergence at the boundary is probably a lost cause.
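The tangent example can be checked exactly. From $\tan' x = 1 + \tan^2 x$, the Maclaurin coefficients $c_k$ of $\tan x$ satisfy $(k+1)\,c_{k+1} = [x^k]\bigl(1 + \tan^2 x\bigr)$, and $c_k^{1/k} \to 2/\pi$ along odd $k$. A sketch with exact rational arithmetic:

```python
from fractions import Fraction
from math import pi

# Maclaurin coefficients of tan x from tan' = 1 + tan^2:
# (k+1) c_{k+1} = [x^k] (1 + tan(x)^2), with c_0 = 0.
N = 30
c = [Fraction(0)] * (N + 1)
for k in range(N):
    conv = sum(c[i] * c[k - i] for i in range(k + 1))  # [x^k] of tan^2
    c[k + 1] = (Fraction(1 if k == 0 else 0) + conv) / (k + 1)

print(c[1], c[3], c[5], c[7])        # 1, 1/3, 2/15, 17/315
print(float(c[29]) ** (1.0 / 29))    # slowly approaches 2/pi ≈ 0.6366
```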

Please feel free to ask for clarification or to change the tags if you think they could be improved.

Update: instead of "pick a sequence of positive integers independently and identically distributed from $\mathbb{N}^+$," perhaps I should specify a distribution. After looking at several common models, I think a Boltzmann or logarithmic distribution might be best, but I'm not sure. I realize this is an important aspect of the problem and I'm sorry I don't have a better idea of what to ask.


Best answer:

Let $\mu:=E[a_0]$ and $\sigma^2:=\operatorname{Var}(a_0)$, and suppose both are finite. Define $Y_i:=a_ix^i$, so that $\mu_i:=E[Y_i]=\mu x^i$ and $\operatorname{Var}(Y_i)=\sigma^2 x^{2i}$. The Kolmogorov two-series theorem states that $\sum_i Y_i$ converges almost surely if $\sum_i\mu_i$ and $\sum_i \operatorname{Var}(Y_i)$ both converge.

For $0 < x < 1$, both $\sum_i \mu x^i$ and $\sum_i \sigma^2 x^{2i}$ are convergent geometric series, so $\sum_i Y_i$ converges almost surely for every such $x$; hence $R \geq 1$ almost surely. Since the $a_i$ are positive integers, the series diverges at $x=1$, so in fact $R = 1$ almost surely.

Note that $R>1$ is impossible here, since $a_i \geq 1$ forces divergence at $x=1$. If the mean or variance of $a_0$ is infinite, the two-series theorem no longer applies; establishing convergence then requires the Kolmogorov three-series theorem and more specific information about the distribution of the $a_i$.
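A quick numerical illustration of the two-series argument (a sketch; the uniform distribution on $\{1,2,3\}$ is an arbitrary choice with finite mean and variance): at a fixed $|x|<1$, the partial sums of $\sum_k a_k x^k$ settle down, consistent with almost-sure convergence.

```python
import random

random.seed(1)

# i.i.d. positive integer coefficients with finite mean and variance
# (uniform on {1, 2, 3} is an arbitrary such choice).
a = [random.randint(1, 3) for _ in range(2000)]

x = 0.9  # any fixed |x| < 1
s, partial = 0.0, []
for k, ak in enumerate(a):
    s += ak * x ** k
    partial.append(s)

# The tail beyond index k is at most 3 * x^k / (1 - x), so the
# partial sums stabilize, as the two-series theorem predicts.
print(partial[999], partial[1999])
```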

Second answer:

Pick a probability distribution $\mathbb{P}$ on the nonnegative integers $\mathbb{N}$, and let $p_n = \mathbb{P}(n)$. For $i \geq 0$, let $A_i$ be independent $\mathbb{N}$-valued random variables with distribution $\mathbb{P}$, so that $\mathbb{P}(A_i = n) = p_n$. Then the random power series

$$ \sum_{k \geq 0} A_k x^k$$

has a random radius of convergence $R$, which is itself the inverse of the random variable $X$ defined by:

$$X = \limsup_{n \rightarrow \infty} A_n^{1/n}.$$

If $p_0 = 1$ then this power series is boring; it is identically zero. So let's assume $p_0 < 1$. Under this assumption, we find the following about the radius of convergence:

Claim: Suppose $p_0 < 1$. With probability 1, $X \geq 1$, so $R = X^{-1} \leq 1$.

Proof: $X < 1$ can happen only if $A_n = 0$ for all sufficiently large $n$ (whenever $A_n \geq 1$ we have $A_n^{1/n} \geq 1$). But for any fixed $N \geq 0$ and any $k \geq 0$, independence gives

$$\mathbb{P}(\forall n \geq N,\ A_n = 0) \leq \mathbb{P}(A_N = A_{N+1} = \cdots = A_{N+k} = 0) = p_0^{k+1},$$

and since $p_0 < 1$ and $k$ was arbitrary, letting $k \to \infty$ shows $\mathbb{P}(\forall n \geq N,\ A_n = 0) = 0$; taking the union over $N$ completes the proof.

A natural follow-up is whether a radius strictly between $0$ and $1$ can be achieved. For i.i.d. coefficients it cannot: $X = \limsup_n A_n^{1/n}$ is a tail function of the $A_n$, hence almost surely constant by the Kolmogorov zero-one law, and by the Borel–Cantelli lemmas $X \geq c$ holds almost surely exactly when $\sum_n \mathbb{P}(A_0 \geq c^n)$ diverges, i.e. when $\mathbb{E}[\log_c A_0] = \infty$. Since this condition does not depend on $c > 1$, we get $R = 1$ almost surely when $\mathbb{E}[\log A_0] < \infty$, and $R = 0$ almost surely otherwise. In particular, supporting $\mathbb{P}$ on a sparse set such as the powers of $2$ or the factorials does not by itself lower $R$; only the heaviness of the tail matters (for example, $\mathbb{P}(A_0 \geq t) \sim 1/\log_2 t$ gives $R = 0$).
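A small simulation sketch of this dichotomy (the particular distributions, helper names, window sizes, and the exponent cap are my own illustrative choices): coefficients $A = 2^K$ with $K$ geometric have $\mathbb{E}[\log A] < \infty$, while $\mathbb{P}(K \geq j) = 1/j$ gives $\mathbb{E}[\log A] = \infty$; a running maximum of $A_n^{1/n}$ typically hugs $1$ in the first case and is visibly larger in the second.

```python
import math
import random

random.seed(2)

def light():
    # A = 2^K with K geometric(1/2): E[log A] = 2 log 2 < infinity.
    k = 1
    while random.random() < 0.5:
        k += 1
    return 2 ** k

def heavy():
    # P(K >= j) = 1/j via K = ceil(1/U): E[K] = infinity, so E[log A] = infinity.
    u = 1.0 - random.random()          # uniform in (0, 1], avoids division by zero
    k = min(math.ceil(1.0 / u), 1000)  # cap the exponent so floats stay finite
    return 2 ** k

def sup_root(sample, lo=100, hi=5000):
    # Finite-window proxy for limsup_n A_n^(1/n).
    return max(sample() ** (1.0 / n) for n in range(lo, hi))

# Typically the light-tailed sup stays near 1 (R = 1), while the
# heavy-tailed sup is noticeably larger (consistent with R = 0).
print("E[log A] finite  :", sup_root(light))
print("E[log A] infinite:", sup_root(heavy))
```

Simulating a $\limsup$ over a finite window is of course only suggestive; the rigorous statement is the Borel–Cantelli argument above.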