(convergence and probability) If $X_n$ converges in probability to $X$, prove...


If $X_n$ converges in probability to $X$, prove:

a) (using only the definition of convergence in probability) For every $\epsilon_k \to 0$ as $k \to \infty$, there exists an $n_k$ such that $P(|X_n − X| \gt \epsilon_k) \lt \frac{1}{2^k}$ for $n \gt n_k$.

b) (using a) and the first Borel–Cantelli lemma) Prove that $X_{n_k}$ converges to $X$ with probability $1$.

a)

The definition of convergence in probability:

$\lim_{n\to\infty}P(|X_n − X| \gt \epsilon) = 0$ for every $\epsilon \gt 0$, so we can choose any $\epsilon_k$ such that

$P(|X_n − X| \gt \epsilon_k) \to 0$; more specifically, for $k=1$: $P(|X_n − X| \gt \epsilon_1) \lt \frac{1}{2^1} = \frac{1}{2}$ for $n$ large enough

b)

If $\sum_n P(A_n) \lt \infty$, then with probability $1$ only finitely many $A_n$ occur.

With $A_k = \{|X_{n_k} − X| \gt \epsilon_k\}$, part a) gives $P(A_k) \lt \frac{1}{2^k}$, so $\sum_k P(A_k) \le \sum_k \frac{1}{2^k} \lt \infty$

That was all I was able to come up with. I know it's incomplete, but is what I've done so far correct? Any thoughts would be appreciated.

Thanks.

On BEST ANSWER

Part 1.

A sequence $X_n$ converges in probability to $X$ if $$\forall \varepsilon>0: \underset{n\to\infty}{\lim}P(|X_n-X|>\varepsilon)=0\tag{1}$$ This means that (the colon ":" reads as "such that" or "the following holds") $$\forall \varepsilon>0\,\forall \delta>0\, \exists N\in\mathbb{N}: \forall n\geq N: |P(|X_n-X|>\varepsilon)-0|<\delta$$ or $$\forall \varepsilon>0\,\forall \delta>0\, \exists N\in\mathbb{N}: \forall n\geq N: P(|X_n-X|>\varepsilon)<\delta\tag{2}$$ Since inequality $(2)$ is satisfied (or rather, such a number $N$ exists) for all $\varepsilon>0$ and $\delta>0$, we can choose them freely. Let's put restrictions on them:

  1. $\varepsilon=\epsilon_k$, where $\underset{k\to\infty}{\lim}\epsilon_k=0$
  2. $\delta=\frac{1}{2^k}$, where $k\in\mathbb{N}$ and is the same $k$ as for $\varepsilon=\epsilon_k$.

In $(2)$ the number $N$ depends on $\delta$ and $\varepsilon$, but with these choices it depends only on $k$, so we can write $N=n_k$.
Then the statement $(2)$ becomes $(3)$: $$\forall k\in\mathbb{N}\,\forall\epsilon_k>0\,\exists n_k: \forall n\geq n_k: P(|X_n-X|>\epsilon_k)<\frac{1}{2^k}\tag{3}$$ and it implies $$\text{For every } \epsilon_k\to 0 \text{ when } k\to\infty \,\, \exists n_k: \forall n\geq n_k: P(|X_n-X|>\epsilon_k)<\frac{1}{2^k}$$

which immediately implies the required statement:

$$\text{For every } \epsilon_k\to 0 \text{ when } k\to\infty \,\, \exists n_k: \forall n>n_k: P(|X_n-X|>\epsilon_k)<\frac{1}{2^k}\tag{4}$$
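As a concrete illustration of statement $(4)$, consider a toy model (an assumption for illustration, not part of the original problem) where the tail probabilities are computable exactly: $X_n \sim \mathrm{Uniform}[0, 1/n]$ and $X = 0$, so $P(|X_n - X| > \epsilon) = \max(0,\, 1 - n\epsilon)$. The sketch below, with the assumed choice $\epsilon_k = 1/k$, finds the smallest admissible $n_k$ for each $k$:

```python
def tail_prob(n, eps):
    """P(|X_n - X| > eps) for the toy model X_n ~ Uniform[0, 1/n], X = 0."""
    return max(0.0, 1.0 - n * eps)

def find_n_k(k, eps_k):
    """Smallest n_k with P(|X_n - X| > eps_k) < 2**-k for all n >= n_k.

    Since tail_prob is non-increasing in n, it is enough to find the first n
    where the inequality holds.
    """
    n = 1
    while tail_prob(n, eps_k) >= 2.0 ** -k:
        n += 1
    return n

for k in range(1, 8):
    eps_k = 1.0 / k  # an assumed sequence with eps_k -> 0
    n_k = find_n_k(k, eps_k)
    print(k, eps_k, n_k, tail_prob(n_k, eps_k))
```

In this particular model the tail vanishes at $n = k$, so the search returns $n_k = k$; in general $n_k$ grows with $k$ at a rate dictated by how fast the tails decay against $2^{-k}$.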

Bonus tip: we didn't actually need $\underset{k\to\infty}{\lim}\epsilon_k=0$ for $\exists n_k: \forall n\geq n_k: P(|X_n-X|>\epsilon_k)<\frac{1}{2^k}$; the $\epsilon_k$ didn't even have to depend on $k$, and it could even be taken so that $\underset{k\to\infty}{\lim}\epsilon_k=\infty$. But in part 2 we will need it.
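As a numerical sanity check of definition $(1)$, here is a minimal Monte-Carlo sketch under an assumed model (not part of the problem): $X \sim N(0,1)$ and $X_n = X + Z_n/n$ with $Z_n \sim N(0,1)$, so $|X_n - X| = |Z_n|/n$ and the tail probability $P(|X_n - X| > \varepsilon)$ shrinks to $0$ as $n$ grows:

```python
import random

def tail_estimate(n, eps, trials=100_000, seed=0):
    """Monte-Carlo estimate of P(|X_n - X| > eps), where |X_n - X| = |Z|/n."""
    rng = random.Random(seed)  # fixed seed: the same Z draws for every n
    hits = sum(abs(rng.gauss(0.0, 1.0)) / n > eps for _ in range(trials))
    return hits / trials

# The estimated tail probability decreases towards 0 as n grows,
# matching lim_{n -> inf} P(|X_n - X| > eps) = 0.
for n in (1, 2, 5, 10):
    print(n, tail_estimate(n, eps=0.5))
```

Reusing the same seed for every $n$ means the events $\{|Z|/n > \varepsilon\}$ are nested across $n$ on the same sample, so the estimates decrease monotonically rather than merely in expectation.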

Part 2.

We need to prove that for the sequence $n_k$ defined in part 1 the sequence $X_{n_k}$ converges with probability 1 to $X$, that is, $$P(\{\omega\in\Omega: \underset{k\to\infty}{\lim}X_{n_k}(\omega)=X(\omega)\})=1\tag{5}$$ We'll use Borel–Cantelli lemma:
$$\underset{k=1}{\overset{\infty}{\sum}}P(A_k)<\infty \implies P(\text{"infinitely many $A_k$ occur"})=0$$

Let's define $$A_k=\{\omega\in\Omega:|X_{n_k}(\omega)-X(\omega)|>\epsilon_k\}=\{|X_{n_k}-X|>\epsilon_k\}\tag{6}$$

Inequality $(3)$ is satisfied for any $n\geq n_k$, including the case $n=n_k$, so $$\underset{k=1}{\overset{\infty}{\sum}}P(A_k) = \underset{k=1}{\overset{\infty}{\sum}}P(|X_{n_k}-X|>\epsilon_k) < \underset{k=1}{\overset{\infty}{\sum}}\frac{1}{2^k}=1<\infty$$

The Borel–Cantelli lemma implies that $$P(\text{"infinitely many $A_k$ occur"})=0$$ which means $$P(\text{"a finite number of $A_k$ occur"})=1\tag{7}$$

$(7)$ means that, with probability $1$, there is only a finite number of $k$ for which the inequality $$|X_{n_k}-X|>\epsilon_k\tag{8}$$ is satisfied. Let $K_1$ be the largest $k$ for which $(8)$ is satisfied. So $$\exists K_1\in\mathbb{N}:\forall k>K_1:|X_{n_k}-X|\leq\epsilon_k\tag{9}$$

Now we finally use $\underset{k\to\infty}{\lim}\epsilon_k=0$ (we need $|X_{n_k}-X|$ to become arbitrarily small as $k$ increases). It means that for any $\alpha>0$ we can find $K_2\in\mathbb{N}$ such that $$\forall k>K_2:\epsilon_k<\alpha\tag{10}$$

We can take $K=\max(K_1,K_2)$; then both inequalities $(9)$ and $(10)$ are satisfied when $k>K$, so $$\forall\alpha>0\, \exists K\in\mathbb{N}:\forall k>K:|X_{n_k}-X|<\alpha\tag{11}$$

$(11)$ means that (by the definition of the limit of a sequence) $$\underset{k\to\infty}{\lim}X_{n_k}=X\tag{12}$$

We didn't write $\omega$ or probability explicitly, but the probabilities of events $(9)$, $(11)$, $(12)$ are equal to $1$ (because we started from $(7)$ and did those steps "inside" the function $P$, i.e. worked with the argument of that function). So we conclude that $$P(\underset{k\to\infty}{\lim}X_{n_k}=X)=1$$ which is the same as $(5)$.
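Why do we only get almost-sure convergence along a subsequence, and not for the full sequence? The classical "typewriter" (sliding-indicator) example makes this concrete; the example and all names below are assumptions for illustration, not part of the original problem. On $\Omega=(0,1]$ with Lebesgue measure, write $n = 2^m + j$ with $0 \le j < 2^m$ and set $X_n = \mathbf{1}_{(j/2^m,\,(j+1)/2^m]}$. Then $P(|X_n - 0| > \epsilon) = 2^{-m} \to 0$, so $X_n \to 0$ in probability, yet for every fixed $\omega$ the full sequence hits $1$ in every dyadic block; along $n_k = 2^k$ the indicator interval is $(0, 2^{-k}]$, so $X_{n_k}(\omega) \to 0$ for every $\omega$:

```python
def typewriter(n, omega):
    """X_n(omega) for the sliding-indicator ("typewriter") sequence."""
    m = n.bit_length() - 1          # n = 2**m + j with 0 <= j < 2**m
    j = n - 2 ** m
    return 1 if j / 2 ** m < omega <= (j + 1) / 2 ** m else 0

omega = 0.3
# Full sequence: hits 1 once in every dyadic block, so it never settles at 0.
full = [typewriter(n, omega) for n in range(1, 257)]
# Subsequence n_k = 2**k: the interval (0, 2**-k] eventually misses omega.
sub = [typewriter(2 ** k, omega) for k in range(0, 9)]
print("ones in X_1..X_256:", sum(full))
print("subsequence X_{2^k}:", sub)
```

For this $\omega$ the full sequence returns to $1$ in each of the eight dyadic blocks covered, while the subsequence is $0$ from $k=2$ onward, mirroring how the Borel–Cantelli argument above forces $|X_{n_k}-X|\leq\epsilon_k$ for all large $k$.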

Bonus tip: For example, let's write the probability of event $(9)$ in full notation: $$P(\{\omega\in\Omega:\exists K_1\in\mathbb{N}:\forall k>K_1:|X_{n_k}(\omega)-X(\omega)|\leq\epsilon_k\})=1\tag{9'}$$ Here $\epsilon_k$ isn't specified (it only has to satisfy $\epsilon_k>0$; no further restrictions yet), but $K_1$ depends on it, and such a $K_1$ exists with probability $1$.