Suppose we have an i.i.d. sequence of random variables $X_n \sim X$, and a sequence $Y_n \to y_0$ in probability, where $y_0$ is a constant. Suppose also that $f : \mathbb{R}^2 \to \mathbb{R}$ is continuous.
First question: Is it the case that
$$\frac{1}{n} \sum_{i=1}^n f(X_i, Y_n) \to \mathbb{E}[f(X, y_0)]$$
in probability? I suspect the answer is no, but I am having trouble finding a counterexample.
The result does hold provided we impose additional assumptions on $f$. For instance, assume either
(1) that $f(x, \cdot)$ is $L(x)$-Lipschitz for each $x$, with $\mathbb{E}[L(X)] < \infty$;
or
(2) that $f(\cdot, y) \to f(\cdot, y_0)$ in the supremum norm whenever $y \to y_0$.
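As a quick sanity check (not part of the original question), here is a small simulation of a case where condition (1) holds: $f(x, y) = \cos(xy)$ satisfies $|\partial_y f(x, y)| \le |x|$, so $f(x, \cdot)$ is $|x|$-Lipschitz, and $\mathbb{E}|X| < \infty$ for $X \sim N(0, 1)$. The choice $Y_n = y_0 + Z/\sqrt{n}$ with $Z$ standard normal is an assumed example of a sequence converging to $y_0$ in probability, and $\mathbb{E}[\cos(X y_0)] = e^{-y_0^2/2}$ by the normal characteristic function.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, y):
    # f(x, y) = cos(x*y): f(x, .) is |x|-Lipschitz and E|X| < inf,
    # so condition (1) holds with L(x) = |x|.
    return np.cos(x * y)

y0 = 1.0
# For X ~ N(0,1), E[cos(X * y0)] = exp(-y0^2 / 2) (characteristic function).
target = np.exp(-y0**2 / 2)

for n in [10**2, 10**4, 10**6]:
    X = rng.standard_normal(n)                      # i.i.d. copies of X
    Y_n = y0 + rng.standard_normal() / np.sqrt(n)   # Y_n -> y0 in probability
    avg = f(X, Y_n).mean()                          # (1/n) * sum_i f(X_i, Y_n)
    print(f"n = {n:>8}: average = {avg:.4f}, |error| = {abs(avg - target):.4f}")
```

The error shrinks as $n$ grows, consistent with convergence to $\mathbb{E}[f(X, y_0)]$ under condition (1).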
Second question: Is there a relationship between these two conditions? That is, is one strictly weaker than the other? For instance, if (1) holds with $L(x)$ bounded, then it is clear that (2) holds. But what about the general case?
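Spelling out the bounded-Lipschitz implication mentioned above: if $L(x) \le M$ for all $x$, then
$$\sup_x \bigl| f(x, y) - f(x, y_0) \bigr| \le \sup_x L(x)\,|y - y_0| \le M\,|y - y_0| \to 0 \quad \text{as } y \to y_0,$$
which is exactly condition (2).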