Continuous Mapping Theorem with two sequences of random variables?


The continuous mapping theorem says that for $\{X_n\}, X$ random variables, with $g$ a function with set of discontinuity points $D_g$ such that $Pr[X \in D_g] = 0$, we have the three relations

  1. $X_n \xrightarrow{d}\ X \quad\Rightarrow\quad g(X_n) \xrightarrow{d} g(X)$
  2. $X_n \xrightarrow{p}\ X \quad\Rightarrow\quad g(X_n) \xrightarrow{p} g(X)$
  3. $X_n \xrightarrow{a.s}\ X \quad\Rightarrow\quad g(X_n)\xrightarrow{a.s} g(X)$

However, does this generalize to two random sequences $\{X_n\}, \{Y_n\}$? A quick reading of the proof suggests that, so long as $g$ is continuous and $\{X_n\}$, $\{Y_n\}$ are bounded in probability, we have

2'. $X_n - Y_n \xrightarrow{p}\ 0 \quad\Rightarrow\quad g(X_n) - g(Y_n) \xrightarrow{p} 0 $

3'. $X_n - Y_n\xrightarrow{a.s}\ 0 \quad\Rightarrow\quad g(X_n) - g(Y_n)\xrightarrow{a.s} 0$

However, I have not been able to find such a fundamental result anywhere. Is it available in a textbook or an article?

This extension of the continuous mapping theorem would be extremely useful to prove that two estimators are asymptotically equivalent, for example.
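As a sanity check, claim 2' can be probed numerically. The sketch below is an illustration under assumed choices, not part of any proof: it takes $g = \tanh$ (continuous, in fact 1-Lipschitz), $X_n$ standard normal, and $Y_n = X_n + Z/n$ with $Z$ standard normal, so both sequences are bounded in probability and $X_n - Y_n \xrightarrow{p} 0$; the fraction of samples with $|g(X_n)-g(Y_n)|$ above a tolerance should shrink as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
g = np.tanh  # an arbitrary continuous g chosen for illustration

for n in [10, 100, 10_000]:
    x = rng.standard_normal(50_000)        # X_n, bounded in probability
    y = x + rng.standard_normal(50_000) / n  # Y_n with X_n - Y_n = O_p(1/n)
    # empirical P(|g(X_n) - g(Y_n)| > 0.01): should decrease toward 0
    print(n, np.mean(np.abs(g(x) - g(y)) > 0.01))
```

The printed proportions decay with $n$, consistent with $g(X_n) - g(Y_n) \xrightarrow{p} 0$.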


Best answer:

2' is true. Use uniform continuity of $g$ on bounded sets. Details: let $\epsilon > 0$. Since $\{X_n\}$ and $\{Y_n\}$ are bounded in probability, there exists $T>0$ such that $P\{|X_n|>T\} <\epsilon$ and $P\{|Y_n|>T\} <\epsilon$ for all $n$. Since $g$ is uniformly continuous on $[-T, T]$, there exists $\delta >0$ such that $|g(x)-g(y)| <\epsilon$ whenever $|x-y| <\delta$ and $|x|\leq T,|y| \leq T$. Now
$$P\{|g(X_n)-g(Y_n)| > \epsilon\} \leq P\{|X_n| >T\}+P\{|Y_n| >T\}+P\{|X_n-Y_n|>\delta \}.$$
The last term $\to 0$ as $n \to \infty$, and the first two terms are each $<\epsilon$.

3' is false. Let $X_n=n$ with probability $\frac 1 n$ and $0$ with probability $1-\frac 1 n$, and let $Y_n=X_n+\frac 1 n$. Then $\{X_n\}$ and $\{Y_n\}$ are bounded in probability, and $X_n-Y_n = -\frac 1 n \to 0$ almost surely (indeed, surely). Let $g(x)=x^{2}$. Then $g(X_n)-g(Y_n)=n^{2}-\left(n+\frac 1 n \right)^{2}=-2-\frac 1 {n^{2}}$ with probability $\frac 1 n$. Assuming the $X_n$ are independent, the fact that $\sum_n P\{g(X_n)-g(Y_n) <-1\} = \sum_n \frac 1 n =\infty$ shows, by the second Borel–Cantelli lemma, that $g(X_n)-g(Y_n) < -1$ infinitely often almost surely, so $g(X_n)-g(Y_n)$ does not converge to $0$ almost surely.
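The counterexample to 3' is easy to simulate. The sketch below (an illustration, not part of the argument) draws $X_n = n$ with probability $\frac1n$, sets $Y_n = X_n + \frac1n$ and $g(x)=x^2$, and estimates $P\{g(X_n)-g(Y_n) < -1\}$, which should come out near $\frac1n$; since $\sum \frac1n$ diverges, independent copies hit the event infinitely often.

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    return x ** 2

def sample(n, size):
    # X_n = n with probability 1/n, else 0; Y_n = X_n + 1/n
    x = np.where(rng.random(size) < 1 / n, float(n), 0.0)
    y = x + 1 / n
    return x, y

for n in [10, 100, 1000]:
    x, y = sample(n, 200_000)
    diff = g(x) - g(y)
    # When X_n = n, diff = n^2 - (n + 1/n)^2 = -2 - 1/n^2 < -1,
    # which happens with probability 1/n; otherwise diff = -1/n^2 > -1.
    print(n, np.mean(diff < -1))
```

Even though $X_n - Y_n = -\frac1n$ vanishes surely, the empirical probability of $g(X_n)-g(Y_n) < -1$ tracks $\frac1n$, whose sum diverges, matching the Borel–Cantelli step above.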