For a certain application, we have the following three assumptions about a sequence of random variables $X_n$ (taking values in $\mathbb{R}^+$, $n \geq 1$):
There exists a random variable $X \geq 0$ such that
a) $X_n \overset{d}{\rightarrow} X$
b) $E[X_n] \rightarrow E[X]$
c) $E[X_n^2] \rightarrow E[X^2]$
I know that $X_n \overset{L^p}{\rightarrow} X$ implies convergence in probability and hence convergence in distribution, but I am not sure how $L^p$ convergence relates to the convergence of the moments.
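For reference, here is the standard argument (via Markov's inequality) for the step I mentioned, namely that $L^p$ convergence implies convergence in probability; for every $\varepsilon > 0$,

$$
P\big(|X_n - X| > \varepsilon\big)
= P\big(|X_n - X|^p > \varepsilon^p\big)
\leq \frac{E\big[|X_n - X|^p\big]}{\varepsilon^p}
\longrightarrow 0,
$$

and convergence in probability implies convergence in distribution.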
So do any of a)–c) imply the others? In the lecture, it was mentioned that if we assume one of these properties, then we have to assume the preceding ones as well. Does b), for example, make sense without a)? Or does b) even imply a)? And are conditions b) and c) weaker than, stronger than, or unrelated to convergence in $L^1$ and $L^2$, respectively?
If we additionally assume that the $X_n$ take values in $\mathbb{N}$, which implications hold then?
Thank you very much for your help.