Suppose $X_n \to X$ in distribution and $Y_n \to X$ in distribution. If $P(X_n - Y_n < -\epsilon) \rightarrow 0$ for every $\epsilon > 0$, show that $X_n-Y_n \rightarrow 0$ in probability.
We know that $P[|X_n-Y_n|>\epsilon] = P[X_n -Y_n > \epsilon]+P[X_n-Y_n < -\epsilon]$, the two events being disjoint. So if it can be shown that $P[X_n-Y_n > \epsilon] \rightarrow 0$, then we are done. But how to show this?
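Before the proof, it is worth seeing that the one-sided hypothesis is doing real work: convergence of $X_n$ and $Y_n$ to the *same* limit in distribution is not enough on its own. Take $X_n = Z$ and $Y_n = -Z$ with $Z \sim N(0,1)$. A quick Monte Carlo sanity check (an illustration only; the sample size and $\epsilon$ are my own choices):

```python
import numpy as np

# Counterexample to the statement WITHOUT the one-sided hypothesis:
# X_n = Z and Y_n = -Z have the same N(0,1) distribution for every n,
# yet X_n - Y_n = 2Z does not concentrate at 0.  Consistently with the
# theorem, P(X_n - Y_n < -eps) does NOT tend to 0 here.
rng = np.random.default_rng(0)
z = rng.standard_normal(200_000)
x, y = z, -z
eps = 0.5

p_lower = np.mean(x - y < -eps)       # ~ P(2Z < -0.5) = Phi(-0.25) ~ 0.40
p_abs = np.mean(np.abs(x - y) > eps)  # ~ P(|2Z| > 0.5) ~ 0.80
print(p_lower, p_abs)
```

Both probabilities stay bounded away from $0$ as $n$ grows, so the hypothesis $P(X_n - Y_n < -\epsilon) \to 0$ fails, and the conclusion fails with it.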
Let us first build intuition by considering the case when $(X_n)$ and $(Y_n)$ are uniformly bounded.
In this case we have $EX_n \to EX$ and $EY_n \to EX$ (uniform boundedness lets us pass expectations through convergence in distribution), so $E(Y_n-X_n) \to 0$. Now $(Y_n-X_n)^{+} \to 0$ in probability, since $(Y_n-X_n)^{+}>\epsilon$ implies $X_n-Y_n <-\epsilon$. By the DCT (the variables are uniformly bounded) we get $E(Y_n-X_n)^{+} \to 0$. Since $E(Y_n-X_n)^{-} = E(Y_n-X_n)^{+} - E(Y_n-X_n)$, this combined with $E(Y_n-X_n) \to 0$ gives $E(Y_n-X_n)^{-} \to 0$, which by Markov's inequality gives $(Y_n-X_n)^{-} \to 0$ in probability. But then $|Y_n-X_n|=(Y_n-X_n)^{+}+(Y_n-X_n)^{-} \to 0$ in probability, and this finishes the proof.
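The bookkeeping above rests on the pointwise identities $a = a^{+} - a^{-}$ and $|a| = a^{+} + a^{-}$. A small numerical check of both (purely illustrative):

```python
import numpy as np

# Check the identities a = a+ - a- and |a| = a+ + a- used to pass from
# E(Y_n - X_n)^+ -> 0 and E(Y_n - X_n) -> 0 to E(Y_n - X_n)^- -> 0.
rng = np.random.default_rng(1)
a = rng.standard_normal(1000)
a_pos = np.maximum(a, 0.0)   # a^+
a_neg = np.maximum(-a, 0.0)  # a^-

ok_diff = np.allclose(a_pos - a_neg, a)
ok_abs = np.allclose(a_pos + a_neg, np.abs(a))
print(ok_diff, ok_abs)
```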
For the general case we can use the fact that convergence in distribution implies tightness: given $\eta >0$ there exists $\Delta$ such that $P(|X_n|>\Delta) <\eta$ and $P(|Y_n|>\Delta) <\eta$ for all $n$. Now consider the truncation function $h(x)= \begin{cases} \Delta & \text{if } x>\Delta \\ x & \text{if } -\Delta \leq x \leq \Delta \\ -\Delta & \text{if } x <-\Delta \end{cases}$. Note that $h$ is bounded, continuous, nondecreasing, and satisfies $|h(a)-h(b)| \leq |a-b|$.
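In code, $h$ is just truncation at level $\Delta$ (numpy's `clip`). Here is a quick check of the two properties the next step uses: $h$ is 1-Lipschitz, and $h(y)-h(x)>\epsilon$ forces $y-x>\epsilon$. (A sketch; the values of $\Delta$ and $\epsilon$ are arbitrary.)

```python
import numpy as np

# h(x) = clip(x, -Delta, Delta).  Two properties used in the argument:
#   (1) |h(y) - h(x)| <= |y - x|           (h is 1-Lipschitz)
#   (2) h(y) - h(x) > eps  =>  y - x > eps (h nondecreasing, plus (1))
rng = np.random.default_rng(2)
delta, eps = 2.0, 0.3
x = rng.uniform(-5, 5, 100_000)
y = rng.uniform(-5, 5, 100_000)
hx, hy = np.clip(x, -delta, delta), np.clip(y, -delta, delta)

lipschitz_ok = np.all(np.abs(hy - hx) <= np.abs(y - x) + 1e-12)
mask = hy - hx > eps
implication_ok = np.all(y[mask] - x[mask] > eps)
print(lipschitz_ok, implication_ok)
```

Property (2) is exactly what lets the one-sided hypothesis on $X_n - Y_n$ be transferred to the truncated variables.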
Then you can see that $E(h(Y_n)-h(X_n))^{+} \to 0$ (since $h(Y_n)-h(X_n)>\epsilon$ implies $Y_n-X_n>\epsilon$, i.e. $X_n-Y_n<-\epsilon$, and the variables are bounded by $2\Delta$, so the DCT applies as before) and $E(h(Y_n)-h(X_n)) \to 0$ (since $h$ is bounded and continuous, $Eh(X_n) \to Eh(X)$ and $Eh(Y_n) \to Eh(X)$). As in the bounded case this gives $E(h(Y_n)-h(X_n))^{-} \to 0$. So $(h(Y_n)-h(X_n))^{-} \to 0$ in probability and $|h(Y_n)-h(X_n)| \to 0$ in probability. Now $P(X_n-Y_n >\epsilon) \leq 2\eta+ P(|X_n| \leq \Delta,\, |Y_n| \leq \Delta,\, h(X_n)-h(Y_n)>\epsilon)$, since on the event $\{|X_n| \leq \Delta,\, |Y_n| \leq \Delta\}$ we have $h(X_n)=X_n$ and $h(Y_n)=Y_n$. I will let you finish the proof from here.
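As a final sanity check on the statement itself, here is a Monte Carlo run on a concrete instance where both hypotheses hold. My own toy choice: $Y_n = Z$ and $X_n = Z + \xi_n/n$ with $\xi_n \sim \mathrm{Exp}(1)$, so that $X_n - Y_n \geq 0$ and the one-sided hypothesis holds trivially.

```python
import numpy as np

# Toy instance of the theorem: Y_n = Z ~ N(0,1), X_n = Z + Exp(1)/n.
# Both converge to N(0,1) in distribution, and X_n - Y_n >= 0, so
# P(X_n - Y_n < -eps) = 0 for every n.  The conclusion predicts
# P(|X_n - Y_n| > eps) -> 0; here it equals P(Exp(1) > n*eps) = e^{-n*eps}.
rng = np.random.default_rng(3)
m, eps = 200_000, 0.5
probs = []
for n in (1, 5, 20):
    z = rng.standard_normal(m)
    xi = rng.exponential(1.0, m)
    x, y = z + xi / n, z
    probs.append(np.mean(np.abs(x - y) > eps))
print(probs)
```

The empirical probabilities decrease toward $0$ as $n$ grows, matching $e^{-n\epsilon}$.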