Let $y_n,x_n$ denote two sequences of random variables. Define $$ y_n=\operatorname{E}(y_n\ |\ x_n)+\epsilon_n $$ Does $\operatorname{var}(y_n\ |\ x_n)=o_p(1)\implies \epsilon_n=o_p(1)$?
I tried to use the law of total variance, but then I need uniform integrability to prove the claim. I am hoping there is another way that does not require such a stronger assumption.
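For concreteness, the decomposition I had in mind is the standard law of total variance,
$$\operatorname{var}(y_n)=\operatorname{E}\bigl[\operatorname{var}(y_n\ |\ x_n)\bigr]+\operatorname{var}\bigl(\operatorname{E}(y_n\ |\ x_n)\bigr),$$
together with the fact that, since $\epsilon_n=y_n-\operatorname{E}(y_n\ |\ x_n)$,
$$\operatorname{E}[\epsilon_n^2]=\operatorname{E}\bigl[\operatorname{E}(\epsilon_n^2\ |\ x_n)\bigr]=\operatorname{E}\bigl[\operatorname{var}(y_n\ |\ x_n)\bigr].$$
So the route through moments requires controlling $\operatorname{E}[\operatorname{var}(y_n\ |\ x_n)]$, not just $\operatorname{var}(y_n\ |\ x_n)$ in probability, and that is where uniform integrability enters.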
An idea: we can use Chebyshev's inequality conditioned on $x_n$, which reads
$$\operatorname{P}\!\left(|y_n-\operatorname{E}[y_n\ |\ x_n]|>\delta \ \middle|\ x_n\right)\le \frac{\operatorname{var}(y_n\ |\ x_n)}{\delta^2},$$
where I write $\delta$ for the threshold to avoid a clash with the residual $\epsilon_n$. Now take expectations with respect to $x_n$ (i.e. multiply by the density $p(x_n)$ and integrate):
$$\operatorname{P}\!\left(|y_n-\operatorname{E}[y_n\ |\ x_n]|>\delta\right)\le \frac{\operatorname{E}[\operatorname{var}(y_n\ |\ x_n)]}{\delta^2}.$$
If we can justify that $\operatorname{E}[\operatorname{var}(y_n\ |\ x_n)]\to 0$ as $n\to\infty$, then we are done, though we may need some additional hypotheses to finish this way. Your hypothesis is that $\operatorname{var}(y_n\ |\ x_n)$ goes to zero in probability, but I am not sure this always implies that its expected value goes to zero as well: in general, convergence in probability does not imply convergence in expectation without something like uniform integrability.
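To illustrate the last point, here is a small simulation sketch. The construction is a toy example of my own (not from the question): take $x_n\sim\operatorname{Uniform}(0,1)$ and set $\operatorname{var}(y_n\ |\ x_n)=n^2$ when $x_n<1/n$ and $0$ otherwise. Then $\operatorname{P}(\operatorname{var}(y_n\ |\ x_n)>0)=1/n\to 0$, so the conditional variance is $o_p(1)$, yet its expectation $n^2\cdot(1/n)=n$ diverges, so the bound above is useless as written.

```python
import numpy as np

rng = np.random.default_rng(0)

def cond_var(x, n):
    """Toy conditional variance Var(y_n | x_n = x):
    equals n**2 on the shrinking event {x < 1/n}, and 0 elsewhere."""
    return np.where(x < 1.0 / n, float(n) ** 2, 0.0)

for n in [10, 100, 1000]:
    x = rng.uniform(size=200_000)      # draws of x_n
    v = cond_var(x, n)                 # conditional variance at each draw
    # P(Var > 0) shrinks like 1/n, while E[Var] grows like n
    print(n, (v > 0).mean(), v.mean())
```

As $n$ grows, the printed frequency of $\{v>0\}$ shrinks toward $0$ (exact value $1/n$) while the sample mean of $v$ grows (exact value $n$), so $\operatorname{var}(y_n\ |\ x_n)=o_p(1)$ does not by itself give $\operatorname{E}[\operatorname{var}(y_n\ |\ x_n)]\to 0$.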