I am given the following table:
and I'm asked to show that $X_n \rightarrow X$ converges in both probability and distribution.
I don't know how to do this (I'm extremely new to convergence in probability). Any clue would be highly appreciated.
The table seems to give the joint distribution of $X_n$ and $X$. Convergence in probability means that the probability that the realizations of $X_n$ and $X$ differ decays to $0$ as $n \rightarrow \infty$, and here it does: the only entries where $X_n \neq X$ carry probability $1/2n$ (when $X = 0$) and $1/4n$ (when $X = 1$ or $X = 2$).
$$P(|X_n - X| > \epsilon) \rightarrow 0 \quad \text{for all } \epsilon > 0.$$
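Concretely, assuming the three off-diagonal entries mentioned above are the only ones where $X_n \neq X$, for every $\epsilon > 0$:

```latex
P(|X_n - X| > \epsilon) \le P(X_n \neq X)
  = \frac{1}{2n} + \frac{1}{4n} + \frac{1}{4n}
  = \frac{1}{n} \xrightarrow[n \to \infty]{} 0.
```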
Convergence in distribution is a much weaker requirement: it says nothing about particular realizations. In fact, $X_n$ can have the same distribution as $X$ yet be the negative of it pointwise, as long as the distribution of $X_n$ converges to that of $X$.
Example: $X_n = (-1)^n X$, where $X$ is standard Gaussian. By symmetry of the Gaussian, every $X_n$ has the same distribution as $X$, but for odd $n$ we have $X_n = -X$, so the realizations can be far apart. Distributionally, the convergence is perfect, yet $X_n$ does not converge to $X$ in probability.
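A quick numerical sketch of this example (the threshold `eps` and the sample size are arbitrary choices on my part): the sign-flipped sequence keeps the standard-normal distribution for every $n$, but for odd $n$ the probability $P(|X_n - X| > \epsilon) = P(2|X| > \epsilon)$ stays bounded away from $0$ no matter how large $n$ gets.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(100_000)  # realizations of standard Gaussian X
eps = 0.5

for n in (1, 2, 1001, 1002):
    Xn = (-1) ** n * X  # X_n = (-1)^n X
    # Distribution is unchanged by the sign flip: mean ~ 0, std ~ 1.
    # But for odd n, |X_n - X| = 2|X|, so the event |X_n - X| > eps
    # keeps roughly constant probability as n grows.
    p = np.mean(np.abs(Xn - X) > eps)
    print(f"n={n:5d}  mean={Xn.mean():+.3f}  std={Xn.std():.3f}  "
          f"P(|X_n - X| > {eps}) = {p:.3f}")
```

For even $n$ the empirical probability is exactly $0$ (since $X_n = X$), while for odd $n$ it is about $P(|X| > \epsilon/2)$, which does not decay, so convergence in probability fails even though the distributions agree for all $n$.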