If $X_{n} \to X$ and $Y_{n} \to Y$ in probability, then $f(X_{n},Y_{n}) \to f(X,Y)$ in probability for $f$ continuous


For a bivariate, real-valued continuous function $f:\mathbb{R}^{2} \to \mathbb{R}$, I need to show that $f(X_{n},Y_{n}) \to f(X,Y)$ in probability whenever $X_{n} \to X$ and $Y_{n}\to Y$ in probability.

This is essentially a bivariate version of the Continuous Mapping Theorem for convergence in probability, so my attempt essentially copies the proof of the univariate case. What I'm unsure about, however, is whether any nuances creep in because all we are given is that $X_{n} \to X$ and $Y_{n} \to Y$ in probability, and not that $(X_{n},Y_{n}) \to (X,Y)$ in probability (although, thinking of convergence of vectors, it seems plausible that we would be guaranteed this).

Anyway, here is what I did:

Let $\epsilon > 0$ be given. Then, for any $K > 0$, $$P(|f(X_{n},Y_{n})-f(X,Y)|>\epsilon)\\ \leq P(|f(X_{n},Y_{n})-f(X,Y)|>\epsilon, \, |(X,Y)|\leq K, \, |(X_{n},Y_{n})|\leq K) + P(\{ |(X,Y)|>K\} \cup \{|(X_{n},Y_{n})|>K\}) $$

We note that since $\{(x,y):|(x,y)|\leq K\}$ is a compact set, $f$ is uniformly continuous on it.

Therefore, $\exists \delta = \delta(\epsilon, K)$ s.t. $|f(X_{n},Y_{n})-f(X,Y)|\leq \epsilon$ whenever $|(X_{n},Y_{n})-(X,Y)|\leq\delta$ (on the event where $|(X,Y)|\leq K$ and $|(X_{n},Y_{n})|\leq K$). And so, $$ \{|f(X_{n},Y_{n})-f(X,Y)|>\epsilon, \, |(X,Y)|\leq K, \, |(X_{n},Y_{n})|\leq K \} \subseteq \{ |(X_{n},Y_{n})-(X,Y)|>\delta \} $$ Thus, applying the triangle inequality to the event $\{|(X_{n},Y_{n})|>K\}$ in the second step, we have $$ P(|f(X_{n},Y_{n})-f(X,Y)|>\epsilon) \leq P(|(X_{n},Y_{n})-(X,Y)|>\delta) + P(|(X,Y)|>K)+P(|(X_{n},Y_{n})|>K) \\ \leq P(|(X_{n},Y_{n})-(X,Y)|>\delta) + P(|(X,Y)|>K) + P\left(|(X,Y)|>\frac{K}{2}\right) + P\left( |(X_{n},Y_{n})-(X,Y)|>\frac{K}{2}\right)$$

Now, let $\gamma > 0$ be arbitrary. Since the second and third terms in the last inequality do not depend on $n$, we can choose $K$ sufficiently large so that each is less than $\gamma/2$.

Thus, we have, for sufficiently large $K$, $\displaystyle P(|f(X_{n},Y_{n})-f(X,Y)|>\epsilon) \leq P(|(X_{n},Y_{n})-(X,Y)|>\delta) + P\left(|(X_{n},Y_{n})-(X,Y)|>\frac{K}{2}\right)+ \gamma.$

Finally, taking the limit superior of both sides of this inequality as $n\to\infty$ and using the fact that $X_{n}-X \to^{P}0$ and $Y_{n}-Y \to^{P} 0$, we have that $\limsup_{n} P(|f(X_{n},Y_{n})-f(X,Y)|>\epsilon) \leq \gamma$, and since $\gamma > 0$ was arbitrary, we have $f(X_{n},Y_{n})-f(X,Y) \to^{P} 0$, as desired.
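(As a numerical sanity check of the statement being proved, not of the proof itself, here is a small Monte Carlo sketch. The choices are mine and purely illustrative: $X_n = X + U_n/n$, $Y_n = Y + V_n/n$ with bounded noise $U_n, V_n$, so that $X_n \to X$ and $Y_n \to Y$ in probability, and $f(x,y) = xy$, which is continuous. The estimated probability $P(|f(X_n,Y_n)-f(X,Y)|>\epsilon)$ should shrink toward $0$ as $n$ grows.)

```python
import random

# Monte Carlo sanity check (illustration only, not a proof):
# X_n = X + U_n/n, Y_n = Y + V_n/n with U_n, V_n ~ Uniform(-1, 1),
# and f(x, y) = x * y.  Estimate P(|f(X_n, Y_n) - f(X, Y)| > eps).

def estimate_prob(n, eps=0.1, trials=20000, seed=0):
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        x = rng.gauss(0.0, 1.0)
        y = rng.gauss(0.0, 1.0)
        xn = x + rng.uniform(-1.0, 1.0) / n
        yn = y + rng.uniform(-1.0, 1.0) / n
        if abs(xn * yn - x * y) > eps:
            count += 1
    return count / trials

probs = [estimate_prob(n) for n in (1, 10, 100, 1000)]
print(probs)  # estimates should decrease toward 0 as n grows
```

For $n = 1000$ the perturbation of $xy$ is on the order of $(|x|+|y|)/1000$, so the estimated probability is essentially zero, consistent with the claimed convergence.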

Is this a correct proof to show that $X_{n} \to X$ and $Y_{n} \to Y$ in probability implies that $f(X_{n},Y_{n}) \to f(X,Y)$ in probability? If not, then what would be a correct proof? Thank you ahead of time for your time and patience!


UPDATE: As this proof stands, I believe it is incomplete. What I think I need to complete it is a proof of the result that says that the vector $(X_{n},Y_{n}) \to (X,Y)$ in probability if and only if $X_{n} \to X$ in probability and $Y_{n} \to Y$ in probability. Could someone please help me prove this? I think that once I have that, everything else I did here will be okay (please correct me if I'm wrong!) Thank you.



BEST ANSWER

Answer to your more specific question

One direction is immediate, since $$P(d(X_n,X) < \epsilon) \ge P(d((X_n,Y_n),(X,Y)) <\epsilon) \to 1$$ (because if $d((X_n,Y_n),(X,Y)) <\epsilon$ then it follows that $d(X_n,X) < \epsilon$), and the same argument works for $Y_n$.

For the other direction, note that by the triangle inequality, $$d((X_n,Y_n),(X,Y))\le d(X_n,X) + d(Y_n,Y),$$ so if $d((X_n,Y_n),(X,Y))> \epsilon$ then at least one of $d(X_n,X)$ and $d(Y_n,Y)$ must be greater than $\epsilon/2$ (actually greater than $\epsilon/\sqrt 2$ for the Euclidean distance, but it doesn't really matter). So $$ P(d((X_n,Y_n),(X,Y)) >\epsilon) \le P((d(X_n,X) >\epsilon/2)\cup (d(Y_n,Y) >\epsilon/2)) \\\le P(d(X_n,X) >\epsilon/2) + P(d(Y_n,Y) >\epsilon/2) \to 0$$
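(A quick numerical illustration of this union bound, with illustrative noise distributions of my own choosing standing in for $X_n - X$ and $Y_n - Y$; the per-sample argument above guarantees the estimated joint probability never exceeds the sum of the componentwise ones.)

```python
import math
import random

# Illustration (not a proof) of the bound
#   P(d((X_n,Y_n),(X,Y)) > eps) <= P(d(X_n,X) > eps/2) + P(d(Y_n,Y) > eps/2)
# using Euclidean distance and Gaussian stand-ins for the differences.

rng = random.Random(1)
trials = 20000
eps = 0.3
joint = comp_x = comp_y = 0
for _ in range(trials):
    dx = rng.gauss(0.0, 0.2)   # stands in for X_n - X
    dy = rng.gauss(0.0, 0.2)   # stands in for Y_n - Y
    if math.hypot(dx, dy) > eps:        # joint (Euclidean) deviation
        joint += 1
    if abs(dx) > eps / 2:
        comp_x += 1
    if abs(dy) > eps / 2:
        comp_y += 1

p_joint = joint / trials
p_bound = comp_x / trials + comp_y / trials
print(p_joint, p_bound)  # p_joint never exceeds p_bound
```

Note that whenever $\sqrt{dx^2+dy^2} > \epsilon$, at least one of $|dx|, |dy|$ exceeds $\epsilon/\sqrt 2 > \epsilon/2$, so the inequality holds sample by sample, not just in expectation.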


Lemma: Let $\{X_{n}\}$ be a sequence of random variables in a probability space and let $X$ be a random variable. Then the following are equivalent:

(a) $X_{n}\rightarrow X$ in probability,

(b) Every subsequence of $\{X_{n}\}$ has a further subsequence that converges to $X$ in probability.

Proof: $(a)\Rightarrow(b)$ is obvious.

$(b)\Rightarrow(a)$. We argue by contradiction. Suppose there exists $\delta>0$ such that $x_{n}:=P\left([|X_{n}-X|\geq\delta]\right)$ does not converge to $0$. Then $\{x_{n}\}$ has a subsequence $\{x_{n_{k}}\}$ that is bounded away from $0$. That is, there exist $\epsilon>0$ and a subsequence $\{x_{n_{k}}\}$ such that $x_{n_{k}}\geq\epsilon$ for all $k$. Then no subsequence of $\{X_{n_{k}}\}_{k}$ converges to $X$ in probability, since $P([|X_{n_{k}}-X|\geq\delta])\geq\epsilon$ along every further subsequence, contradicting assumption (b).

---

For your problem: Let $\{f(X_{n_{k}},Y_{n_{k}})\}_{k}$ be an arbitrary subsequence of $\{f(X_{n},Y_{n})\}_{n}$. Note that $X_{n_{k}}\rightarrow X$ in probability, so $\{X_{n_{k}}\}_k$ has a subsequence $\{X_{n_{k_{l}}}\}_{l}$ that converges to $X$ pointwise $P$-a.e. (by the standard result that convergence in probability implies the existence of an a.e. convergent subsequence). Also $Y_{n_{k_{l}}}\rightarrow Y$ in probability as $l\rightarrow\infty$, so it has a subsequence $\{Y_{n_{k_{l_{j}}}}\}_{j}$ that converges to $Y$ pointwise $P$-a.e. Now $X_{n_{k_{l_{j}}}}\rightarrow X$ and $Y_{n_{k_{l_{j}}}}\rightarrow Y$ pointwise $P$-a.e. Since $f$ is continuous, it follows that $f(X_{n_{k_{l_{j}}}},Y_{n_{k_{l_{j}}}})\rightarrow f(X,Y)$ pointwise $P$-a.e., and hence $f(X_{n_{k_{l_{j}}}},Y_{n_{k_{l_{j}}}})\rightarrow f(X,Y)$ in probability as $j\rightarrow\infty$. By the lemma, the result follows.

---

Please do not completely trust my mathematics; one should examine the above argument carefully before accepting it. As I have not worked with probability theory for a long time, I may have made mistakes without realizing them. Please do not hesitate to point out any mistakes you find.