Convergence in distribution: Show $ \limsup_{n \rightarrow \infty} \mathbb{P} (x^n \in C) \leq \mathbb{P}(x \in C) $


Consider random vectors $(x^n)$ and $x$ in $\mathbb{R}^q$, defined on a probability space $(\Omega,\mathcal{A},\mathbb{P})$. Show that if $x^n \overset{d}{\rightarrow} x$, then for every closed set $C$ $$ \limsup_{n \rightarrow \infty} \mathbb{P} (x^n \in C) \leq \mathbb{P}(x \in C). $$
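For intuition, here is a minimal numerical sketch in Python (my own illustration, not part of the exercise): take $x^n = Z + 1/n$ with $Z \sim N(0,1)$ and the closed set $C = (-\infty, 0]$. Then $x^n \overset{d}{\rightarrow} Z$, $\mathbb{P}(x^n \in C) = \Phi(-1/n)$, and $\mathbb{P}(Z \in C) = 1/2$, so the claimed inequality holds (with equality in the limit, since the boundary $\partial C = \{0\}$ carries no mass here).

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10**6  # Monte Carlo sample size (illustrative choice)

# x^n = Z + 1/n with Z ~ N(0,1) converges in distribution to x = Z.
# For the closed set C = (-inf, 0], estimate P(x^n in C) by simulation.
def prob_in_C(n):
    z = rng.standard_normal(N)
    return np.mean(z + 1.0 / n <= 0.0)

estimates = {n: prob_in_C(n) for n in (1, 10, 100, 1000)}
p_limit = 0.5  # P(Z <= 0) = 1/2 for the standard normal
print(estimates)  # values increase toward 0.5 as n grows
assert max(estimates.values()) <= p_limit + 3 / np.sqrt(N)  # within MC error
```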

My thoughts:

Obviously we have to use $x^n \overset{d}{\rightarrow} x$, so here is the definition of convergence in distribution:

We say that $(x^n) \in \mathbb{R}^q$ converges in distribution to $x \in \mathbb{R}^q$ if for the corresponding distribution functions $F_n$ and $F$, respectively, and for every continuity point $a \in \mathbb{R}^q$ of $F$, it holds that $\lim_{n \rightarrow \infty} F_n(a) = F(a)$ .

And $C$ closed means that $C^c$ (the complement of $C$) is open. Maybe this is helpful? Perhaps we could show $ \liminf_{n \rightarrow \infty} \mathbb{P} (x^n \in O) \geq \mathbb{P}(x \in O) $ for every open set $O$ and use it to verify the claim above. But how can I show it, and how can I use it to deduce $ \limsup_{n \rightarrow \infty} \mathbb{P} (x^n \in C) \leq \mathbb{P}(x \in C) $? I know that $C$ is closed if and only if $C^c$ is open, and moreover that $\mathbb{P}(A) = 1-\mathbb{P}(A^c)$. Or do you have another simple idea for solving this exercise?

Update: I have almost shown that $ \liminf_{n \rightarrow \infty} \mathbb{P} (x^n \in O) \geq \mathbb{P}(x \in O) $. Can someone explain to me why $\liminf_{n \rightarrow \infty} \mathbb{1}_O(x^n) \geq \mathbb{1}_O(x)$ holds for an open set $O$? My claim is that $ \liminf_{n \rightarrow \infty} \mathbb{P} (x^n \in O) \geq \mathbb{P}(x \in O) $ for every open $O$ implies $ \limsup_{n \rightarrow \infty} \mathbb{P} (x^n \in C) \leq \mathbb{P}(x \in C) $ for every closed $C$. According to a lecture I found: this follows upon noting that $C$ is closed if and only if $C^c$ is open, and that $\mathbb{P}(C) = 1 - \mathbb{P}(C^c)$. Can someone explain this to me in detail?
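For what it's worth, the indicator inequality is a pointwise statement, so it implicitly assumes a realization on which $x^n \to x$ (for instance almost-sure representatives from Skorokhod's representation theorem); under that assumption, here is a sketch:

```latex
\textbf{Claim.} If $O$ is open and $x^n \to x$, then
$\liminf_{n\to\infty} \mathbb{1}_O(x^n) \ge \mathbb{1}_O(x)$.

\textbf{Sketch.} If $x \notin O$, then $\mathbb{1}_O(x) = 0$ and the
inequality is trivial. If $x \in O$, openness gives an $\varepsilon > 0$
with $B(x,\varepsilon) \subseteq O$; since $x^n \to x$, we have
$x^n \in B(x,\varepsilon) \subseteq O$ for all large $n$, hence
$\mathbb{1}_O(x^n) = 1$ eventually and
$\liminf_{n\to\infty} \mathbb{1}_O(x^n) = 1 = \mathbb{1}_O(x)$.
```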



Best answer:

The updated part can be regarded as the dual version of the proposition. For clarity, I state and prove it as follows.

Let $O\subseteq\mathbb{R}^{d}$ be an open subset. Then $\liminf_{n}P\left(\left[X_{n}\in O\right]\right)\geq P\left(\left[X\in O\right]\right)$.

Proof: Let $C=\mathbb{R}^{d}\setminus O$; then $C$ is a closed set. Write $\mu_{n}$ and $\mu$ for the distributions of $X_{n}$ and $X$ (as in the answer below), and note that $\mu_{n}(O)=1-\mu_{n}(C)$ and $\mu(O)=1-\mu(C)$. Using the closed-set inequality $\limsup_{n}\mu_{n}(C)\leq\mu(C)$, we have: \begin{eqnarray*} \liminf_{n}\mu_{n}(O) & = & \liminf_{n}\left(1-\mu_{n}(C)\right)\\ & = & 1+\liminf_{n}\left(-\mu_{n}(C)\right)\\ & = & 1-\limsup_{n}\mu_{n}(C)\\ & \geq & 1-\mu(C)\\ & = & \mu(O). \end{eqnarray*} That is, $\liminf_{n}P\left(\left[X_{n}\in O\right]\right)\geq P\left(\left[X\in O\right]\right)$.
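The direction the update actually asks about (open-set version implies closed-set version) is the same computation with the roles of $O$ and $C$ swapped; a sketch:

```latex
% Assume \liminf_n \mu_n(O) \ge \mu(O) for every open O.
% Let C be closed and put O = C^c, which is open.  Then
\begin{eqnarray*}
\limsup_{n}\mu_{n}(C) & = & \limsup_{n}\left(1-\mu_{n}(C^{c})\right)\\
 & = & 1-\liminf_{n}\mu_{n}(C^{c})\\
 & \leq & 1-\mu(C^{c})\\
 & = & \mu(C).
\end{eqnarray*}
```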

Second answer:

Let $\mu_{n}$ be the probability measure on $(\mathbb{R}^{d},\mathcal{B}(\mathbb{R}^{d}))$ induced by the random vector $X_{n}$. That is, $\mu_{n}(A)=P\left(X_{n}^{-1}(A)\right)$, $A\in\mathcal{B}(\mathbb{R}^{d})$. Let $\mu$ be the probability measure on $\mathbb{R}^{d}$ induced by $X$. Recall one equivalent definition of convergence in distribution: $X_{n}\rightarrow X$ in distribution $\iff$ For each bounded continuous function $\phi:\mathbb{R}^{d}\rightarrow\mathbb{R}$, $\int\phi\,d\mu_{n}\rightarrow\int\phi\,d\mu$.

Let $C$ be a non-empty closed subset of $\mathbb{R}^{d}$. For each $m\in\mathbb{N}$, define $U_{m}=\bigcup\{B(x,\frac{1}{m})\mid x\in C\}$, where $B(x,r)$ denotes the open ball centered at $x$ with radius $r$. Note that $U_{m}$ is open, $U_{1}\supseteq U_{2}\supseteq\ldots\supseteq C$, and $C=\bigcap_{m} U_{m}$. (The last equality follows from the well-known property that for any non-empty closed set $C\subseteq\mathbb{R}^{d}$ and $x\notin C$, $d(x,C)=\inf\{d(x,y)\mid y\in C\}>0$: there exists a sequence $(y_n)$ in $C$ such that $d(x,y_n)\rightarrow d(x,C)$; clearly $(y_n)$ is bounded, so we may choose a convergent subsequence $y_{n_k}\rightarrow y$. Since $C$ is closed, $y\in C$, and it follows that $d(x,C)=d(x,y)>0$. Hence $x\notin U_m$ whenever $m>1/d(x,C)$.) Applying Urysohn's Lemma to the pair $(C,U_{m})$, we obtain a continuous function $\phi_{m}:\mathbb{R}^{d}\rightarrow[0,1]$ such that $\phi_{m}=1$ on $C$ and $\phi_m=0$ on $(U_m)^c$. Clearly $\chi_{C}\leq\phi_{m}$, and $\phi_{m}\rightarrow\chi_{C}$ pointwise (for $x\notin C$, $\phi_m(x)=0$ once $m>1/d(x,C)$). We remark that each $\phi_{m}$ is bounded and continuous, hence $\lim_{n}\int\phi_{m}\,d\mu_{n}=\int\phi_{m}\,d\mu$.
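As an aside, in $\mathbb{R}^d$ one can bypass Urysohn's Lemma with an explicit formula; a standard alternative choice (not literally the construction above) is:

```latex
% Explicit candidate for \phi_m, using that y \mapsto d(y,C) is 1-Lipschitz:
\[
  \phi_{m}(y) = \max\bigl(0,\, 1 - m\, d(y,C)\bigr).
\]
% This \phi_m is continuous with values in [0,1], equals 1 on C (where
% d(y,C) = 0), and vanishes when d(y,C) \ge 1/m, i.e. on (U_m)^c,
% since U_m = \{ y : d(y,C) < 1/m \}.
```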

For any $m,n\in\mathbb{N}$, we have $$ \int\phi_{m}\,d\mu_{n}\geq\int\chi_{C}\,d\mu_{n}. $$ Taking $\limsup_{n}$ on both sides yields \begin{eqnarray*} \int\phi_{m}\,d\mu & = & \lim_{n}\int\phi_{m}\,d\mu_{n}\\ & \geq & \limsup_{n}\int\chi_{C}\,d\mu_{n}. \end{eqnarray*} Now letting $m\rightarrow\infty$ and applying the Lebesgue Dominated Convergence Theorem (the $\phi_m$ are dominated by the constant function $1$, which is $\mu$-integrable), we obtain $\int\chi_{C}\,d\mu\geq\limsup_{n}\int\chi_{C}\,d\mu_{n}$. That is, $\limsup_{n}P\left(\left[X_{n}\in C\right]\right)\leq P\left(\left[X\in C\right]\right)$.
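To make the approximation step concrete, here is a small Monte Carlo sketch (my own illustration, with the hypothetical choices $X \sim N(0,1)$, $C = (-\infty, 0]$, and the explicit $\phi_m$ from the aside above), showing $\int \phi_m \, d\mu$ decreasing toward $\mu(C)$ as $m \rightarrow \infty$:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10**6  # Monte Carlo sample size (illustrative choice)

# For C = (-inf, 0] in R we have d(y, C) = max(y, 0), so
# phi_m(y) = max(0, 1 - m * max(y, 0)).
z = rng.standard_normal(N)  # samples from mu = N(0,1)

def mean_phi_m(m, samples):
    return np.mean(np.maximum(0.0, 1.0 - m * np.maximum(samples, 0.0)))

p_C = np.mean(z <= 0.0)  # Monte Carlo estimate of mu(C), about 0.5
for m in (1, 10, 100, 1000):
    print(m, mean_phi_m(m, z))  # decreases toward p_C, as the DCT step predicts
print("mu(C) ~", p_C)
```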