Suppose we have non-negative random variables: $X_n$ ($n\in\mathbb N$). For fixed $0<\alpha<\beta$ constants, we know that $$\lim_{n\to\infty}\mathbb EX_n^\alpha=\lim_{n\to\infty}\mathbb EX_n^\beta=c<\infty$$ Does this mean convergence in probability: $X_n\rightarrow c$?
Using Markov's inequality, convergence in probability would follow from
$$\mathbb P(|X_n-c|>\epsilon) \le \frac{\mathbb E|X_n-c|}{\epsilon},$$ so it suffices to show that $\lim_{n\to\infty}\mathbb E|X_n-c|=0$ under the given assumptions. I am a bit stuck on that step; could you give me a hint?
No. Just consider, e.g., a fair Bernoulli-distributed random variable $X$ (i.e. $\mathbb{P}(X=1) = \mathbb{P}(X= 0) = \frac{1}{2}$) and define
$$X_n := X \qquad \text{for all $n \in \mathbb{N}$.}$$ Then $(X_n)_{n \in \mathbb{N}}$ is a sequence of identically distributed non-negative random variables, and therefore
$$\mathbb{E}(X_n^{\alpha}) \quad \text{and} \quad \mathbb{E}(X_n^{\beta})$$
do not depend on $n$ for any $\alpha,\beta$. Since
Since $X$ takes only the values $0$ and $1$, we have $X^2 = X$, and hence $$\mathbb{E}(X) = \mathbb{E}(X^2) = c :=\frac{1}{2},$$ so the assumption $$\lim_{n \to \infty} \mathbb{E}(X_n) = \lim_{n \to \infty} \mathbb{E}(X_n^2) = c$$ is satisfied (with $\alpha = 1$, $\beta = 2$). On the other hand, as
$$\mathbb{P}\left( \left|X_n-\frac{1}{2} \right| \geq \frac{1}{4} \right)= \mathbb{P}\left( \left|X-\frac{1}{2} \right| \geq \frac{1}{4} \right)=1$$
for all $n \in \mathbb{N}$, we find that $X_n$ does not converge in probability to $c=\frac{1}{2}$.
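A quick numerical sanity check of this counterexample (a hypothetical simulation, not part of the argument itself): sampling the fair Bernoulli variable $X$ confirms that both moments are near $\frac{1}{2}$ while $|X - \frac{1}{2}| \geq \frac{1}{4}$ holds for every sample.

```python
import random

# Simulate the counterexample: X is Bernoulli(1/2), so X takes values in {0, 1}.
random.seed(0)
n_samples = 100_000
samples = [random.randint(0, 1) for _ in range(n_samples)]

# Estimate E(X) and E(X^2). Since X is 0/1-valued, X^2 = X, so the
# two sample moments agree exactly.
mean_x = sum(samples) / n_samples
mean_x2 = sum(x * x for x in samples) / n_samples

# Every sample is 0 or 1, so |X - 1/2| = 1/2 >= 1/4 always holds;
# the empirical probability of the event is therefore 1.
frac_far = sum(abs(x - 0.5) >= 0.25 for x in samples) / n_samples

print(mean_x, mean_x2, frac_far)
```

The constant sequence $X_n := X$ makes the moment hypotheses trivially hold while the sequence stays a fixed distance from $c = \frac{1}{2}$ with probability one.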