Convergence of Random Power Series


Q) Let $X_1, X_2, \ldots$ be i.i.d. and not $\equiv 0$. Show that the radius of convergence of the power series $\sum_{n\geq 1}X_nz^n$ is $1$ a.s. or $0$ a.s. according as $E[\log^{+}|X_1|]<\infty$ or $=\infty$, where $\log^{+}x = \max(\log x, 0)$.

I am trying to understand the proof of this result given here, but I fail to understand the portion where $E[\log_{+}(|X_{1}|)]<\infty$. By the way, I am posting this as a new question because this is too long for a comment and, since the original is a fairly old post, a comment there would most likely be ignored.

It says that $\sum_{n=1}^{\infty}P(\log_{+}(|X_{n}|)\geq \epsilon n)=\infty$ for any $\epsilon>0$, and hence $|X_{n}|< e^{\epsilon n}$ for large $n$ with probability $1$.

But why is that the case? If the above holds, then by the second Borel–Cantelli lemma we would have $\log_{+}(|X_{n}|)\geq \epsilon n$ for infinitely many $n$ with probability $1$.

I think that since we need $|X_{n}|<e^{\epsilon n}$ for large $n$, i.e. for all but finitely many $n$, we should instead consider $\sum_{n=1}^{\infty}P(\log_{+}|X_{n}|\geq \epsilon n)$ and show that it converges. Then by the first Borel–Cantelli lemma, $|X_{n}|\geq e^{\epsilon n}$ occurs infinitely often with probability $0$, or, in other words, $|X_{n}|< e^{\epsilon n}$ occurs for all but finitely many $n$ with probability $1$.

Explicitly, I mean that if $A_{n}= \{|X_{n}|< e^{\epsilon n}\}$, then $P(\liminf_{n} A_{n})=1-P(\limsup_{n} A_{n}^{c})$.
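For reference, the two Borel–Cantelli (BC) lemmas being invoked above are:

$$\text{(BC 1)}\quad \sum_{n=1}^{\infty}P(A_{n})<\infty \implies P(\limsup_{n} A_{n})=0,$$

$$\text{(BC 2)}\quad \sum_{n=1}^{\infty}P(A_{n})=\infty \text{ and the } A_{n} \text{ independent} \implies P(\limsup_{n} A_{n})=1,$$

where $\limsup_{n} A_{n}$ is the event that infinitely many of the $A_{n}$ occur.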

But then again, I run into the problem of how to show $\sum_{n=1}^{\infty}P(\log_{+}|X_{n}|>\epsilon n)<\infty$ at all.

I would want to use Markov's inequality, but for the resulting series to converge that would require something like $n^{1+\epsilon}$ in place of $n$.

That is, I can at most show $$\sum_{n=1}^{\infty}P(\log_{+}|X_{n}|>\epsilon n^{1+\epsilon}) \leq \sum_{n=1}^{\infty}\frac{E[\log_{+}|X_{1}|]}{\epsilon n^{1+\epsilon}}\leq M\sum_{n=1}^{\infty}\frac{1}{n^{1+\epsilon}}<\infty$$ for all $\epsilon>0$. This would only allow me to conclude $|X_{n}|<e^{\epsilon n^{1+\epsilon}}$ for all large $n$.

But this does not really help with the problem, as @KaviRamaMurthy's answer uses $|X_nz^{n}| \leq e^{n\epsilon} |z|^{n}=(e^{\epsilon}|z|)^{n}$ as the term of a geometric series to show convergence, whereas for any $\epsilon>0$ the bound $e^{\epsilon n^{1+\epsilon}}$ eventually outgrows $e^{\epsilon n}$. In any case, I am not using independence with this approach, so there must be something wrong.

Question: Can anybody tell me how to show the required $|X_{n}|<e^{\epsilon n}$ for large $n$ with probability $1$, in this or some other way?

On BEST ANSWER

With the brilliant hint from @jakobdt, I think I have solved the question. I invite others to comment on this and point out any mistakes.

Fix $\epsilon > 0$.

As $E[Y]=\int_{0}^{\infty}P(Y\geq t)\,dt$ for any nonnegative random variable $Y$ (provable by a simple application of Tonelli's theorem), we have:

$$E[\frac{\log_{+}|X_{n}|}{\epsilon}]<\infty \implies \int_{0}^{\infty}P(\frac{\log_{+}(|X_{n}|)}{\epsilon}\geq t)\,dt <\infty$$

This means by the Integral test, $$\sum_{n=1}^{\infty}P(\frac{\log_{+}(|X_{n}|)}{n}> \epsilon)<\infty.$$
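To spell out the integral-test step: since $t\mapsto P\big(\log_{+}(|X_{1}|)/\epsilon\geq t\big)$ is nonincreasing,

$$\sum_{n=1}^{\infty}P\left(\frac{\log_{+}(|X_{1}|)}{\epsilon}\geq n\right)\leq \int_{0}^{\infty}P\left(\frac{\log_{+}(|X_{1}|)}{\epsilon}\geq t\right)dt<\infty,$$

and since the $X_{n}$ are identically distributed, $P\big(\log_{+}(|X_{1}|)/\epsilon\geq n\big)=P\big(\log_{+}(|X_{n}|)/n\geq \epsilon\big)\geq P\big(\log_{+}(|X_{n}|)/n> \epsilon\big)$, which gives the displayed sum.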

As $$\sum_{n=1}^{\infty} P\left(\left|\frac{\log_{+}(|X_{n}|)}{n}\right|>\epsilon\right)<\infty$$ for every $\epsilon>0$ (the argument above works for each fixed $\epsilon$), we have $\frac{\log_{+}(|X_{n}|)}{n}\xrightarrow{a.s.} 0$. (This uses the standard fact, itself a consequence of the first Borel–Cantelli lemma, that if $\sum_{n} P(|X_{n}-X|>\epsilon)<\infty$ for every $\epsilon>0$, then $X_{n}\to X$ a.s.)

But the above implies $$|X_{n}|^{\frac{1}{n}}\leq\exp\left(\frac{\log_{+}(|X_{n}|)}{n}\right)\xrightarrow{a.s.} 1\implies \limsup_{n}|X_{n}|^{\frac{1}{n}}\leq 1.$$

But by the Cauchy–Hadamard formula, $\limsup_{n}|X_{n}|^{1/n}$ is precisely the reciprocal of the radius of convergence. Hence $R\geq 1$ a.s.
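As a quick numerical sanity check (not part of the proof), here is a simulation with i.i.d. standard normal $X_n$, a case where $E[\log_{+}|X_{1}|]<\infty$, so we expect $\limsup_{n}|X_{n}|^{1/n}=1$ a.s.:

```python
import numpy as np

# i.i.d. standard normals: E[log+|X_1|] < infinity, so the theorem
# predicts limsup |X_n|^(1/n) = 1 a.s. (radius of convergence R = 1).
rng = np.random.default_rng(0)
n_max = 100_000
x = rng.standard_normal(n_max)
n = np.arange(1, n_max + 1)

# |X_n|^(1/n); the limsup is approximated by the sup over a tail block.
roots = np.abs(x) ** (1.0 / n)
tail_sup = roots[n_max // 2:].max()  # sup over n in [50_000, 100_000]
print(f"sup of |X_n|^(1/n) over the tail: {tail_sup:.6f}")
```

The tail supremum comes out very close to $1$, as predicted.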

For $|z|>1$, we have $P(|X_{n}|> \frac{1}{|z|^{n}}) = P(|X_{1}|> \frac{1}{|z|^{n}}) \to P(|X_{1}|> 0)> 0$ (since $X_{1}\not\equiv 0$). By the reverse Fatou lemma, $P(|X_{n}||z|^{n} \geq 1 \text{ infinitely often}) > 0$. This means that with positive probability the series $\sum_{n=1}^{\infty} X_{n}z^{n}$ diverges (as the $n$-th term does not go to $0$).

But convergence of the series is a tail event, and since the $X_{n}$ are independent, Kolmogorov's zero-one law says it has probability $0$ or $1$. As it has probability strictly less than $1$ by the above, the series diverges with probability $1$ for every $|z|>1$. Hence $R\leq 1$ a.s., and combined with the first part, $R=1$ a.s.
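Again purely as an illustrative sketch (standard normal $X_n$, $|z|=1.1$ chosen arbitrarily), one can watch the terms $|X_{n}||z|^{n}$ blow up rather than tend to $0$, which is exactly the divergence mechanism used above:

```python
import numpy as np

# For |z| > 1 the terms |X_n| |z|^n should not tend to 0:
# P(|X_n| > |z|^{-n}) -> P(|X_1| > 0) = 1, so the n-th term
# exceeds 1 infinitely often and the series diverges.
rng = np.random.default_rng(1)
z_abs = 1.1
n_max = 300
x = rng.standard_normal(n_max)
n = np.arange(1, n_max + 1)

terms = np.abs(x) * z_abs ** n
big = int((terms[-100:] > 1.0).sum())  # how many of the last 100 terms exceed 1
print(f"last 100 terms exceeding 1: {big} / 100")
```

Nearly all of the late terms exceed $1$, so the $n$-th term test already rules out convergence.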

P.S. I welcome other answers to this problem that might be shorter.