I have a somewhat open-ended question. Let's say I have a sequence of random variables $(X_n: n \geq 1)$ which are neither independent, ergodic, nor identically distributed. Normally I would say that I am completely dead in the water, but let's say that $X_n \overset{d}{\to} X$. Are there any additional assumptions under which I can say that:
$$ \frac{1}{n} \sum_{i=1}^n X_i \;\overset{P}{\to}\; \mathbb{E}X $$
Even if I assume that the expectation of the left-hand side converges to $\mathbb{E}X$, I'm stuck thinking about this more generally. Any tips?
EDIT: Thinking about this some more, I feel like making a martingale out of the LHS and then checking under what conditions we have the desired martingale convergence would be a reasonable route to follow. Any thoughts on this?
EDIT 2: per Nate Eldredge's comment below, I need to assume that the expectation of the partial-sum average on the LHS converges to $\mathbb{E}X$... it doesn't follow from $X_n \overset{d}{\to} X$.
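(To see why this assumption really is extra, here is a standard counterexample: take $X_n$ with $P(X_n = n) = 1/n$ and $P(X_n = 0) = 1 - 1/n$. Then $X_n \overset{d}{\to} 0$ since $P(X_n = 0) \to 1$, yet
$$ \mathbb{E}X_n = n \cdot \tfrac{1}{n} = 1 \quad \text{for every } n, $$
so convergence in distribution alone gives no control over the expectations.)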
Revised per OP comments
Here is a writeup on Laws of Large Numbers for non-iid random variables. I think Law I on pdf page 4 has what you are looking for:
Given a sequence of square-integrable random variables $X_i$, if the variances are bounded and the covariances are either negative or bounded in absolute value, then your normalized sum converges to $\mathbb{E}X$ in probability.
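As a quick sanity check (my own sketch, not taken from the writeup), here is a small simulation of a sequence satisfying these hypotheses: the 1-dependent moving-average sequence $X_i = (Z_i + Z_{i+1})/\sqrt{2}$ with $Z_i$ iid standard normal, which has bounded variances ($\mathrm{Var}(X_i) = 1$) and bounded covariances ($\mathrm{Cov}(X_i, X_{i+1}) = 1/2$, zero otherwise), but is clearly not independent:

```python
import numpy as np

# 1-dependent moving-average sequence: X_i = (Z_i + Z_{i+1}) / sqrt(2)
# with Z_i iid N(0,1). Then Var(X_i) = 1, Cov(X_i, X_{i+1}) = 1/2,
# and all other covariances vanish, so the Chebyshev-type LLN applies.
rng = np.random.default_rng(0)
n = 100_000
z = rng.standard_normal(n + 1)
x = (z[:-1] + z[1:]) / np.sqrt(2)

# The normalized sum should be close to E[X_i] = 0 for large n.
print(x.mean())
# Check the claimed second-moment structure empirically.
print(x.var())                      # approximately 1
print(np.mean(x[:-1] * x[1:]))      # approximately 1/2
```

Rerunning with larger $n$ shrinks the sample mean roughly like $1/\sqrt{n}$, which is exactly the Chebyshev bound used in the proof of that law.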