I'm studying from some probability notes. After proving that convergence in probability implies convergence in distribution, an exercise asks the reader to come up with an example of random variables $X,\,X_1,\, X_2,\,\ldots$ defined on the same probability space, such that $X_n\to X$ in distribution and $X$ is independent of the sequence $X_1,\,X_2,\,\ldots\,$. Then the author says this exercise should kill all hope of obtaining any stronger notion of convergence from convergence in distribution, even with the assumption that all random variables are defined on the same probability space. There are two things about this that bother me:
First, these notes have already worked with sequences of i.i.d. random variables, so isn't the above exercise trivial? Couldn't we take $X,\,X_1,\,X_2,\,\ldots$ to be any i.i.d. sequence of random variables, since then every $X_n$ has the same distribution as $X$ (so $X_n\to X$ in distribution trivially) while $X$ is independent of the sequence?
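To convince myself that the i.i.d. example really does kill convergence in probability, I ran a quick sanity-check simulation (the setup and names are my own, not from the notes): take $X$ and each $X_n$ to be i.i.d. fair coin flips with values $0$ and $1$. Then $\mathbb{P}(|X_n-X|\ge \tfrac12)=\mathbb{P}(X_n\ne X)=\tfrac12$ for every $n$, which cannot tend to $0$.

```python
import random

def estimate_disagreement(n_trials: int = 100_000, seed: int = 0) -> float:
    """Estimate P(|X_n - X| >= 1/2) for X, X_n i.i.d. fair coin flips.

    Each trial draws X and X_n independently and uniformly from {0, 1};
    |X_n - X| >= 1/2 happens exactly when the two flips disagree.
    """
    rng = random.Random(seed)
    disagree = sum(
        1
        for _ in range(n_trials)
        if rng.randint(0, 1) != rng.randint(0, 1)  # X_n differs from X
    )
    return disagree / n_trials

if __name__ == "__main__":
    # The true probability is 1/2 for every n, so it does not tend to 0:
    # convergence in probability fails even though X_n -> X in distribution
    # holds trivially (all the variables share one distribution).
    print(estimate_disagreement())
```

The printed estimate hovers around $0.5$, matching the exact value $\tfrac12$.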
And second, does any stronger notion of convergence force some dependence between the limiting random variable $X$ and the approximating sequence $X_1,\,X_2,\,\ldots$? The author seems to suggest that stronger modes of convergence and dependence go hand in hand. Do $X$ and $X_1,\,X_2,\,\ldots$ need to be dependent if $X_n\overset{L^p}{\longrightarrow}X\,$ or $\,X_n\overset{a.s.}{\longrightarrow}X\,$?
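To frame the second question, here is the computation I have in mind (my own check, not from the notes) for why the stronger modes visibly fail in the i.i.d. example: with $X,\,X_1,\,X_2,\,\ldots$ i.i.d. uniform on $\{0,1\}$,
$$\mathbb{E}\,|X_n-X|^p \;=\; 1^p\cdot\mathbb{P}(X_n\neq X) \;=\; \tfrac12 \quad\text{for all } n,$$
so $X_n\not\to X$ in $L^p$; and since almost sure convergence would imply convergence in probability, the fact that $\mathbb{P}\big(|X_n-X|\ge\tfrac12\big)=\tfrac12$ for all $n$ rules out $X_n\overset{a.s.}{\longrightarrow}X$ as well. What I am unsure about is the converse direction: whether these modes of convergence can ever hold with $X$ independent of the sequence (beyond the degenerate case where $X$ is a.s. constant).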