Is there a simple example of a sequence $X_1,X_2,\ldots$ of random variables such that $X_n \to 1$ almost surely, but $P(X_n\leq 0) \to 1$?
I'm trying to understand the difference between convergence a.s. and convergence in probability through counterexamples.
I'll just sum up what I said in the comments into this answer.
Almost sure convergence implies convergence in probability, so what you are asking for is not possible. Specifically, $P(X_{n}\leq 0)\leq P(|X_{n}-1|>0.5)$, and $P(|X_{n}-1|>0.5)\to 0$ by the definition of convergence in probability (namely, $X_{n}\xrightarrow{P}1$ means $P(|X_{n}-1|>\epsilon)\xrightarrow{n\to\infty} 0$ for every fixed $\epsilon>0$).
In fact, to extend on @Henry's comment: you can have $E(X_{n})\to+\infty$ and still $P(X_{n}\leq 0)\to 1$. As an explicit example, let $B_{n}$ be $\text{Bernoulli}(\frac{1}{n})$ random variables and set $X_{n}=n^{2}B_{n}-1$. Then $E(X_{n})=n^{2}\cdot\frac{1}{n}-1=n-1\to\infty$, but $P(X_{n}\leq 0)=P(B_{n}=0)=1-\frac{1}{n}\to 1$. If you instead take $X_{n}=2nB_{n}-1$, you get what Henry was suggesting.
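A quick numerical sketch of the explicit example above (my own check, not part of the original answer): simulate $B_n\sim\text{Bernoulli}(1/n)$ and $X_n=n^2B_n-1$ for one large $n$, and compare the empirical mean and $P(X_n\leq 0)$ against the theoretical values $n-1$ and $1-\frac{1}{n}$. The function name and parameters are illustrative.

```python
import random

def simulate(n, trials=200_000, seed=0):
    """Draw `trials` copies of X_n = n^2 * B_n - 1 with B_n ~ Bernoulli(1/n).

    Returns the empirical mean of X_n and the empirical P(X_n <= 0).
    """
    rng = random.Random(seed)
    xs = [n * n * (1 if rng.random() < 1 / n else 0) - 1 for _ in range(trials)]
    mean = sum(xs) / trials
    p_nonpos = sum(x <= 0 for x in xs) / trials
    return mean, p_nonpos

mean, p = simulate(50)
# Theory: E(X_50) = 50 - 1 = 49 and P(X_50 <= 0) = 1 - 1/50 = 0.98,
# so despite a large (and growing) mean, X_n is non-positive almost all the time.
print(mean, p)
```

For larger $n$ the effect is more dramatic: the mean grows without bound while the non-positive probability creeps toward 1, which is exactly why convergence of expectations says nothing about convergence in probability here.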