Mann-Wald theorem for complete convergence


We say that the sequence of random variables $X_1,X_2,\dots$ converges completely to $X$ if, for every $\epsilon >0$, it holds that $$\sum_{n=1}^{\infty} \mathbb{P}(| X_n-X| >\epsilon)<\infty.$$
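For intuition, here is a quick numerical illustration (my own, not part of the original question) of when the defining series is summable: tail probabilities like $1/n^2$ give complete convergence, while $1/n$ does not, even though both tend to $0$.

```python
import math

# Hypothetical tail probabilities (illustration only):
# P(|X_n - X| > eps) = 1/n^2  -> the defining series converges (to pi^2/6),
# so such a sequence converges completely.
partial_sq = [math.fsum(1 / n**2 for n in range(1, N + 1)) for N in (10, 100, 10_000)]
print(partial_sq)  # bounded above by pi^2 / 6 ~= 1.6449

# P(|X_n - X| > eps) = 1/n  -> the harmonic series diverges, so complete
# convergence fails, even though the sequence still converges in probability
# (the individual probabilities 1/n tend to 0).
partial_h = [math.fsum(1 / n for n in range(1, N + 1)) for N in (10, 100, 10_000)]
print(partial_h)   # grows like log N, without bound
```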

The Mann-Wald (continuous mapping) theorem states that continuous functions preserve limits for the following types of convergence: convergence in distribution, convergence in probability, and almost sure convergence.

But what about complete convergence?

When $X_1, X_2, \dots$ are independent, almost sure convergence and complete convergence are equivalent (the nontrivial direction follows from the second Borel-Cantelli lemma, using that the limit of an independent sequence is almost surely constant), and applying a continuous function to each $X_n$ preserves independence, so the claim should still hold in this case.

Is there an example showing how the Mann-Wald theorem breaks down for complete convergence when the random variables are not independent?

Best answer:

Let me first remark that uniformly continuous functions preserve complete convergence. Indeed, this is pretty much trivial from the definition. Suppose $X_n\to X$ completely and $f:\mathbb{R}\to\mathbb{R}$ is uniformly continuous. Given $\epsilon>0$, choose $\delta>0$ such that $|x-y|\leq\delta$ implies $|f(x)-f(y)|\leq\epsilon$. Then $|f(X_n)-f(X)|>\epsilon$ implies $|X_n-X|>\delta$ so $$\sum_n\mathbb{P}(|f(X_n)-f(X)|>\epsilon)\leq \sum_n\mathbb{P}(|X_n-X|>\delta)<\infty.$$
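As a sanity check (not a proof), one can test the event inclusion $\{|f(X_n)-f(X)|>\epsilon\}\subseteq\{|X_n-X|>\delta\}$ numerically for a uniformly continuous $f$. Here I use $f=\sin$, which is $1$-Lipschitz, so $\delta=\epsilon$ works:

```python
import math
import random

# Numerical sanity check of the event inclusion used in the proof:
# sin is 1-Lipschitz, hence uniformly continuous with delta = eps.
# Whenever |sin(x) - sin(y)| > eps, we must have |x - y| > delta.
random.seed(0)
eps = delta = 0.1
for _ in range(100_000):
    x, y = random.uniform(-50.0, 50.0), random.uniform(-50.0, 50.0)
    if abs(math.sin(x) - math.sin(y)) > eps:
        assert abs(x - y) > delta  # the inclusion never fails
print("event inclusion verified on 100000 random pairs")
```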

So, to find a counterexample, you need a function that is continuous but not uniformly continuous. Let $f:\mathbb{R}\to\mathbb{R}$ be any function that is continuous but not uniformly continuous. Take an $\epsilon_0>0$ that witnesses the failure of uniform continuity, so there exist sequences $(a_n)$ and $(b_n)$ with $|a_n-b_n|\to 0$ and $|f(a_n)-f(b_n)|>\epsilon_0$ for all $n$ (passing to a subsequence, we may assume the $a_n$ are distinct). Let $X$ take the value $a_n$ for each $n$ with probability $\frac{1}{n}-\frac{1}{n+1}$. Let $X_n$ be such that $X_n=b_m$ when $X=a_m$ for $m\geq n$ and $X_n=a_m$ when $X=a_m$ for $m<n$.

Then $X_n\to X$ completely since $|a_n-b_n|\to 0$, so for any fixed $\epsilon>0$ we have $|X_n-X|\leq\epsilon$ everywhere for $n$ sufficiently large. However, $f(X_n)$ does not converge to $f(X)$ completely since $$\mathbb{P}(|f(X_n)-f(X)|>\epsilon_0)=\sum_{m\geq n}\left(\frac{1}{m}-\frac{1}{m+1}\right)=\frac{1}{n}$$ and $\sum_n 1/n$ diverges.
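To make the construction concrete (my own choice of ingredients, not from the original answer): take $f(x)=x^2$, $a_m=m$, $b_m=m+1/m$, so $|a_m-b_m|=1/m\to 0$ while $|f(a_m)-f(b_m)|=2+1/m^2>2=:\epsilon_0$. The tail probabilities can then be computed exactly with rational arithmetic:

```python
from fractions import Fraction

# Concrete ingredients (my own choice, not from the original answer):
# f(x) = x^2 is continuous but not uniformly continuous; with a_m = m and
# b_m = m + 1/m we get |a_m - b_m| = 1/m -> 0 while
# |f(a_m) - f(b_m)| = (m + 1/m)**2 - m**2 = 2 + 1/m**2 > 2 =: eps0.

M = 10_000  # truncation level for the exact tail sums

def p_X(m):
    """P(X = a_m) = 1/m - 1/(m+1); these telescope to total mass 1."""
    return Fraction(1, m) - Fraction(1, m + 1)

# On {X = a_m} with m >= n, |X_n - X| = |b_m - a_m| = 1/m; otherwise X_n = X.
# With eps = 1/100, the event {|X_n - X| > eps} requires 1/m > 1/100, i.e. m < 100.
def p_dev_X(n):
    return sum(p_X(m) for m in range(n, 100))  # empty (= 0) once n >= 100

# Only the terms with n < 100 are nonzero, so the series converges:
# X_n -> X completely.
series_X = sum(p_dev_X(n) for n in range(1, 200))
print(float(series_X))

# On {X = a_m} with m >= n, |f(X_n) - f(X)| = 2 + 1/m**2 > eps0, so
# P(|f(X_n) - f(X)| > eps0) = sum_{m >= n} P(X = a_m) = 1/n (telescoping).
def p_dev_fX(n):
    return sum(p_X(m) for m in range(n, M))  # equals exactly 1/n - 1/M

for n in (1, 2, 3, 10):
    assert p_dev_fX(n) == Fraction(1, n) - Fraction(1, M)
print([str(p_dev_fX(n)) for n in (1, 2, 3)])  # harmonic tail => divergent series
```

Here the summands for $f(X_n)$ are computed only up to the truncation level $M$; the telescoping identity in the assertions confirms the exact value $1/n - 1/M$, which tends to the harmonic term $1/n$ as $M\to\infty$.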