Say we have $n$ i.i.d. random variables $U_1, \dots, U_n$, each uniformly distributed on $(0,2)$. Let $X_n = U_1 U_2 U_3 \cdots U_n$. In which of the senses (almost surely, in probability, in mean square, in distribution) does $X_n$ converge?
I know I have to check the definition of each type of convergence and see whether $X_n$ satisfies it. But what do I substitute for $X_n$ and $X$ in the definitions? For example, convergence in probability requires $P(|X_n - X| > \epsilon) \to 0$ as $n \to \infty$ for every $\epsilon > 0$.
What would I use for $X_n$? For $X$, I would normally use the value or distribution I expect the sequence to converge to, but I have no idea what that would be in this case.
You can see that $(X_n)_{n\in\mathbb{N}}$ is a martingale, because $\mathbb{E}(X_{n+1} \mid \sigma(X_1,\dots,X_n)) = X_n\,\mathbb{E}(U_{n+1}) = X_n$, since $\mathbb{E}(U_{n+1}) = 1$. A non-negative martingale converges almost surely (en.wikipedia.org/wiki/Doob%27s_martingale_convergence_theorems) to some random variable $X_\infty$, and almost-sure convergence implies convergence in probability and in distribution. In fact $X_\infty = 0$ a.s.: applying the strong law of large numbers to $\frac{1}{n}\log X_n = \frac{1}{n}\sum_{i=1}^n \log U_i$ and noting that $\mathbb{E}(\log U_1) = \log 2 - 1 < 0$, we get $\log X_n \to -\infty$ a.s. Finally, using $\mathbb{E}(U_n^2) = \frac{4}{3}$, you get $\mathbb{E}(X_n^2) = \left(\frac{4}{3}\right)^n \to \infty$, so $X_n$ does not converge in $L^2$ (mean square).
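If it helps to see these claims numerically, here is a quick sanity-check simulation (a sketch, not part of the proof; the path count and horizon are arbitrary choices). It illustrates the three behaviours at once: the sample paths collapse toward $0$, the sample mean stays near $1$ (martingale property), and the empirical second moment grows like $(4/3)^n$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, paths = 200, 100_000

# Each row is one realization of U_1, ..., U_n ~ Uniform(0, 2).
U = rng.uniform(0.0, 2.0, size=(paths, n))
# Running products: X[:, k-1] = U_1 * ... * U_k along each path.
X = np.cumprod(U, axis=1)

# Almost-sure convergence to 0: by n = 200 nearly every path is tiny,
# consistent with E[log U_1] = log 2 - 1 < 0 and the SLLN.
frac_tiny = np.mean(X[:, -1] < 1e-10)
print("fraction of paths below 1e-10 at n=200:", frac_tiny)

# Martingale property: E[X_n] = 1 for every n.
print("sample mean of X_10:", X[:, 9].mean())

# Second moment: E[X_10^2] = (4/3)^10 ~ 17.76, already far from 0,
# and it keeps growing, so there is no L^2 limit.
print("sample second moment of X_10:", (X[:, 9] ** 2).mean())
```

Note the tension the last two prints expose: the mean is pinned at $1$ while the mass of the distribution piles up at $0$, which is only possible because a vanishing fraction of paths carries enormous values; that heavy tail is exactly what kills $L^2$ convergence.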