convergence modes


There are many results about which convergence mode implies another: for example, almost sure convergence implies convergence in probability, which in turn implies convergence in distribution. Thinking about this, I came up with the following questions, which I don't know how to prove or disprove.

Let $X_{n},Y_{n}$ be random variables with $\mathbb{P}_{X_{n}} = \mathbb{P}_{Y_{n}}$ for all $n$. I want to prove or disprove the following claims:

  1. If $X_{n} \rightarrow 0$ in probability, does this imply $Y_{n} \rightarrow 0$ in probability?
  2. If $X_{n} \rightarrow 0$ almost surely, does this imply $Y_{n} \rightarrow 0$ almost surely?
  3. If $X_{n} \rightarrow 0$ in $\mathcal{L}^{1}$, does this imply $Y_{n} \rightarrow 0$ in $\mathcal{L}^{1}$?

What do you think?

Answer:
  1. Yes, that's true. Since $X_n$ and $Y_n$ have, by assumption, the same distribution, we know that $$\mathbb{P}(X_n \in B) = \mathbb{P}(Y_n \in B)$$ for any measurable set $B$. In particular, we can choose $B := (-\infty,-\epsilon) \cup (\epsilon,\infty)$ for fixed $\epsilon>0$ to find that $$\mathbb{P}(|X_n|>\epsilon) = \mathbb{P}(|Y_n|>\epsilon).$$ If $X_n \to 0$ in probability, then the left-hand side converges to $0$ as $n \to \infty$, and so does the right-hand side. Hence, $Y_n \to 0$ in probability.
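To make this concrete, here is a minimal numerical sketch. The random variables $X_n(\omega) = \omega/n$ and $Y_n(\omega) = (1-\omega)/n$ on $((0,1),\lambda)$ are a hypothetical example of my own: for each $n$ both are uniform on $(0,1/n)$, so they have the same distribution, and the tail probabilities, computed from the two *different* sets $\{|X_n|>\epsilon\}$ and $\{|Y_n|>\epsilon\}$, agree and tend to $0$.

```python
def tail_prob_X(n, eps):
    # {w in (0,1) : w/n > eps} is the interval (n*eps, 1);
    # its Lebesgue measure is 1 - n*eps (or 0 once n*eps >= 1)
    return max(0.0, 1.0 - n * eps)

def tail_prob_Y(n, eps):
    # {w in (0,1) : (1-w)/n > eps} is the interval (0, 1 - n*eps);
    # a different set, but with the same measure
    return max(0.0, 1.0 - n * eps)
```

Both functions return the same value for every $n$ and $\epsilon$, matching the identity $\mathbb{P}(|X_n|>\epsilon) = \mathbb{P}(|Y_n|>\epsilon)$, and both vanish once $n \geq 1/\epsilon$.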

  2. No, this is, in general, wrong. Consider the sequence of random variables $(Y_n)_{n \in \mathbb{N}}$ on the probability space $((0,1],\mathcal{B}((0,1]))$ (endowed with Lebesgue measure $\lambda$) defined by $$\begin{align*} Y_1(\omega) &:= 1_{\big(\frac{1}{2},1 \big]}(\omega) \\ Y_2(\omega) &:= 1_{\big(0, \frac{1}{2}\big]}(\omega) \\ Y_3(\omega) &:= 1_{\big(\frac{3}{4},1 \big]}(\omega) \\ Y_4(\omega) &:= 1_{\big(\frac{1}{2},\frac{3}{4} \big]}(\omega)\\ &\vdots \end{align*}$$ Then $Y_n$ does not converge almost surely to $0$ (since for any $\omega \in (0,1]$ and $N \in \mathbb{N}$ there exist $m,n \geq N$ such that $Y_n(\omega)=1$ and $Y_m(\omega)=0$). On the other hand, we can define $$\begin{align*} X_1(\omega) &:= 1_{\big(0,\frac{1}{2} \big]}(\omega) \\ X_2(\omega) &:= 1_{\big(0, \frac{1}{2}\big]}(\omega) \\ X_3(\omega) &:= 1_{\big(0,\frac{1}{4} \big]}(\omega) \\ X_4(\omega) &:= 1_{\big(0,\frac{1}{4} \big]}(\omega)\\ &\vdots \end{align*}$$ For each fixed $n \in \mathbb{N}$, $X_n$ and $Y_n$ have the same distribution, and it follows easily that $X_n \to 0$ almost surely.
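The construction can be checked mechanically. The sketch below encodes both sequences as functions on $(0,1]$; the indexing formula for the terms hidden behind the dots (group $k$ consists of $2^k$ intervals of length $2^{-k}$, swept from right to left) is an assumption chosen to be consistent with the four displayed terms.

```python
import math

def interval(n):
    # Interval whose indicator is Y_n in the "typewriter" sequence.
    # Assumed pattern: group k covers indices n = 2^k - 1, ..., 2^(k+1) - 2
    # and sweeps (0,1] from right to left with intervals of length 2^-k.
    k = int(math.floor(math.log2(n + 1)))  # group index of n
    j = n - (2 ** k - 1)                   # position inside group: 0, ..., 2^k - 1
    hi = 1.0 - j * 2.0 ** (-k)
    return hi - 2.0 ** (-k), hi            # the interval (lo, hi]

def Y(n, w):
    lo, hi = interval(n)
    return 1 if lo < w <= hi else 0

def X(n, w):
    # X_n is the indicator of (0, 2^-k]: an interval of the same length,
    # hence X_n and Y_n have the same distribution for each n
    k = int(math.floor(math.log2(n + 1)))
    return 1 if 0 < w <= 2.0 ** (-k) else 0
```

For a fixed $\omega$, say $\omega = 0.3$, one can verify that `X(n, 0.3)` is $0$ for all $n \geq 3$ (pointwise convergence), while `Y(n, 0.3)` equals $1$ exactly once in every group, i.e. infinitely often.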

  3. Yes, that's true. If we denote by $\mathbb{P}_{X_n}$ the distribution of $X_n$, then $$\mathbb{E}(f(X_n)) = \int_{\mathbb{R}} f(y) \, d\mathbb{P}_{X_n}(y)$$ holds for any measurable function $f: \mathbb{R} \to \mathbb{R}$ such that $f(X_n) \in L^1(\mathbb{P})$. Since $X_n$ and $Y_n$ have the same distribution, this implies $$\mathbb{E}(f(X_n)) = \mathbb{E}(f(Y_n))$$ for any such $f$. If $X_n \to 0$ in $L^1$, then we can choose $f(x)=|x|$ to conclude that $$\mathbb{E}(|Y_n|) = \mathbb{E}(|X_n|) \to 0$$ i.e. $Y_n \to 0$ in $L^1$.
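For the counterexample from part 2, this identity is easy to check by hand: each $X_n$ and $Y_n$ is the indicator of an interval, so $\mathbb{E}(|X_n|) = \mathbb{E}(|Y_n|)$ is just that interval's length. A short sketch (again assuming the "group $k$ has $2^k$ intervals of length $2^{-k}$" pattern behind the dots):

```python
import math

def E_abs(n):
    # In the typewriter example, X_n and Y_n are indicators of intervals
    # of length 2^-k, where k is the (assumed) group index of n,
    # so E|X_n| = E|Y_n| = 2^-k.
    k = int(math.floor(math.log2(n + 1)))
    return 2.0 ** (-k)
```

Since $2^{-k} \to 0$ as $n \to \infty$, both sequences converge to $0$ in $L^1$, even though only $X_n$ converges almost surely.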