Convergence for Random Variables with No Unique Limit Function


My task is to determine whether the random variables defined by $X_n(t) = n \cdot (-t)^n$ with $t \in [0,1]$ converge almost surely to another random variable $X$ in the probability space $([0,1], \mathcal{B}([0,1]), \lambda_{[0,1]})$. As a first step I tried to find $X$ by computing $\lim_{n \to \infty} X_n(t)$, which gives me the following:

$X(t) = \left\{\begin{array}{ll} 0, &0 \leq t < 1 \\ \text{undefined}, & t = 1 \end{array}\right.$

(At $t = 1$, $X_n(1) = n \cdot (-1)^n$ oscillates between large negative and large positive values, so the limit does not exist, not even as $\pm \infty$.)
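To justify the first case (a short added argument, not part of the original post): for $0 \leq t < 1$ we have $|X_n(t)| = n\,t^n$, and the ratio of consecutive terms satisfies

$$\frac{(n+1)\,t^{n+1}}{n\,t^n} = t\left(1 + \frac{1}{n}\right) \xrightarrow{n \to \infty} t < 1,$$

so $n\,t^n \to 0$ by the ratio test: the geometric decay of $t^n$ dominates the linear growth of $n$.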

So my question now is whether it is even possible for this sequence of random variables to converge at all. I would say no, but I am not sure, since $\{1\}$ is a null set for the Lebesgue measure. If I am correct, is it therefore OK to conclude that any sequence of random variables without a unique limit function does not converge for any probability measure?



Best answer:

Your calculations are correct, and you can conclude that $(X_n)$ converges almost surely to $X = 0$ directly from the definition: $$\begin{align}\lambda_{[0,1]}\left(\left\{t\in[0,1] : \lim_{n\to\infty}X_n(t) = X(t)\right\}\right) &=\lambda_{[0,1]}\left([0,1[\right)\\ &= 1.\end{align} $$

So $(X_n)$ converges almost surely to the constant random variable $0$.

You are right to be concerned about the "ill-behaved" $t=1$ case, where $X_n(1)$ oscillates with unbounded magnitude, but since $\{1\}$ has measure zero, it does not affect almost sure convergence. In particular, your proposed conclusion is too strong: a sequence that fails to converge at some points can still converge almost surely, as long as the set of bad points is a null set under the given probability measure.
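For intuition, here is a small numerical sketch (my own addition, not part of the original answer; the helper name `X` is mine) contrasting the two regimes: decay to $0$ for $t < 1$, and oscillation at $t = 1$.

```python
def X(n, t):
    """Evaluate X_n(t) = n * (-t)**n."""
    return n * (-t) ** n

# For t = 0.9 < 1, |X_n(t)| = n * 0.9**n shrinks toward 0 as n grows.
for n in [10, 100, 1000]:
    print(n, X(n, 0.9))

# For t = 1, X_n(1) = n * (-1)**n oscillates: -1, 2, -3, 4, ...
print([X(n, 1) for n in range(1, 7)])
```

The printed values make the measure-theoretic point concrete: the sequence misbehaves only on the single point $t = 1$, which $\lambda_{[0,1]}$ ignores.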