Let $(X_n)_{n\ge 1}$ be a sequence of random variables with $X_n\overset{d}{\to}X$.
Then for any $g\in C_b(\mathbb{R})$ we have $|\mathbb{E}[g(X_n)]-\mathbb{E}[g(X)]|\to 0$ as $n\to\infty$.
Are there any necessary and sufficient conditions for the following convergence:
$\mathbb{E}[|g(X_n)-g(X)|]\to 0$ as $n\to\infty$?
Many Thanks
Suppose that $\Omega = [0,1]$ with the Borel sigma-algebra $\mathcal{F}$ and Lebesgue measure $P$. Consider $X_n(\omega) = \omega$ and $X(\omega) = 1- \omega$. Both are $U[0,1]$-distributed, so trivially $X_n \overset{d}{\to} X$. Take $g$ bounded and smooth with $g(x) = x$ for $x \in [0,1]$. Then, writing $U \sim U[0,1]$, $$E|g(X_n) - g(X)| = E|2\omega - 1| = E|2U - 1| = \frac12 > 0,$$ since $2U - 1 \sim U[-1,1]$ and hence $|2U - 1| \sim U[0,1]$. This example shows that even bounded smooth functions $g$ and bounded random variables $X_n$ are not enough.
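For completeness, the value $\frac12$ can also be checked by a direct integral: with $U \sim U[0,1]$,
$$E|2U - 1| = \int_0^1 |2u - 1|\,du = \int_0^{1/2} (1 - 2u)\,du + \int_{1/2}^1 (2u - 1)\,du = \frac14 + \frac14 = \frac12.$$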
So, the answer is negative.
Very special case: if $X$ is a constant, then it follows that $X_n \to X$ in probability, and thus $g(X_n) \to g(X)$ in probability. As $g$ is bounded, we get $E|g(X_n) - g(X)| \to 0$ by the dominated convergence theorem.
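(For completeness, the first step is the standard fact that convergence in distribution to a constant $c$ implies convergence in probability: for any $\varepsilon > 0$, the points $c \pm \varepsilon$ are continuity points of the limiting CDF $F_X = \mathbf{1}_{[c,\infty)}$, so
$$P(|X_n - c| > \varepsilon) \le F_{X_n}(c - \varepsilon) + \big(1 - F_{X_n}(c + \varepsilon)\big) \longrightarrow 0 + (1 - 1) = 0.)$$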
So a sufficient condition is that $X$ is a constant (a.s.).
The difficulty with more general conditions comes from the following fact: if we know the distribution of $X_n$ and know nothing about $X$ except its distribution, then we have no information about the joint distribution of $(X_n, X)$, and hence none about $E|g(X_n) - g(X)|$. The one special case where this does not matter is when $X$ is a constant, as mentioned above.
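To see this concretely, reuse the example above: let $U \sim U[0,1]$ and $X_n = U$ for every $n$. The two couplings $X = U$ and $X = 1 - U$ have the same marginal distribution $U[0,1]$, yet with $g(x) = x$ on $[0,1]$,
$$E|g(X_n) - g(X)| = 0 \qquad\text{vs.}\qquad E|g(X_n) - g(X)| = E|2U - 1| = \frac12,$$
so the quantity in question is not determined by the marginal distributions alone.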