Consider random variables $X$, $X_1$, $X_2$,... with expectation $0$ and variance $1$ and with distributions $\mu$, $\mu_1$, $\mu_2$,...
Why is it enough to show $$ \int f d\mu_n \rightarrow \int f d\mu$$ for convex 1-Lipschitz continuous functions $f:\mathbb{R} \rightarrow \mathbb{R}$ to prove convergence in distribution $X_n \overset{\mathcal{D}}{\longrightarrow} X$?
More direct proof.
Apply the fundamental theorem of calculus for piecewise-differentiable functions to the function $f(x)=(x-y)\,1_{x>y}$, which is convex and $1$-Lipschitz with derivative $f'(x)=1_{x>y}$ for $x\neq y$. Thus $$ (x-y)\cdot 1_{x>y}=\int_{-\infty}^x 1_{t>y}\ dt=\int_{y}^{\infty} 1_{x>t}\ dt. $$ Taking expectations and using Fubini's theorem yields $$ \mathbb E[(X-y) 1_{X>y}]=\int_{y}^{\infty}\mathbb P(X>t)\ dt. $$
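As a quick sanity check (not part of the proof), the tail-integral identity can be verified numerically. The choice of distribution below is purely illustrative: for $X\sim\mathrm{Exp}(1)$ we have $\mathbb P(X>t)=e^{-t}$ for $t\ge 0$, so both sides equal $e^{-y}$ when $y\ge 0$.

```python
import math
import random

random.seed(0)

# Illustrative choice: X ~ Exp(1), for which P(X > t) = e^{-t} (t >= 0),
# so the tail integral over [y, inf) equals e^{-y} for y >= 0.
samples = [random.expovariate(1.0) for _ in range(200_000)]

y = 0.5
# Monte Carlo estimate of E[(X - y) 1_{X > y}]
lhs = sum(max(x - y, 0.0) for x in samples) / len(samples)
# Closed form of the tail integral for Exp(1)
rhs = math.exp(-y)
assert abs(lhs - rhs) < 0.01  # the two sides agree up to Monte Carlo error
```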
Thus by substituting the function $f(x)=(x-y)1_{x>y}$ into the condition, we find that $$ \lim_{n\to\infty}\int_y^{\infty}\mathbb P(X_n>t)\ dt=\int_y^{\infty}\mathbb P(X>t)\ dt $$ for all $y\in\mathbb R$.
The function $G_n(y)=\int_y^{\infty}\mathbb P(X_n>t)\ dt$ is convex, since its right derivative $-\mathbb P(X_n>y)$ is non-decreasing in $y$, and pointwise convergence of convex functions implies convergence of their derivatives at every point where the limit is differentiable. The limit $G(y)=\int_y^{\infty}\mathbb P(X>t)\ dt$ is differentiable exactly at the continuity points of the CDF of $X$, so $$ \lim_{n\to\infty}\mathbb P(X_n>y)=\mathbb P(X>y) $$ at every such point, which is precisely convergence in distribution.
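For a numerical illustration (a sketch only, not part of the argument; the distribution and the point $y$ are chosen arbitrarily), let $X_n$ be a standardized count of heads in $n$ fair coin flips, which converges to $N(0,1)$ by the CLT. Both the tail integral $\mathbb E[(X_n-y)^+]$ and the tail probability approach their Gaussian counterparts:

```python
import math
import random

random.seed(1)

def sample_Xn(n):
    # Standardized number of heads among n fair coin flips
    heads = bin(random.getrandbits(n)).count("1")
    return (heads - n / 2) / math.sqrt(n / 4)

y = 0.3  # an arbitrary continuity point of the limiting CDF
samples = [sample_Xn(100) for _ in range(50_000)]

# Empirical tail probability and tail integral E[(X_n - y)^+]
tail_prob = sum(1 for x in samples if x > y) / len(samples)
tail_int = sum(max(x - y, 0.0) for x in samples) / len(samples)

# Gaussian counterparts: P(Z > y) and E[(Z - y)^+] = phi(y) - y * P(Z > y)
normal_tail = 0.5 * math.erfc(y / math.sqrt(2))
normal_int = math.exp(-y * y / 2) / math.sqrt(2 * math.pi) - y * normal_tail

assert abs(tail_prob - normal_tail) < 0.02
assert abs(tail_int - normal_int) < 0.03
```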
Previous iterations of this answer
First suppose that all random variables in question have exponential moments ($\mathbb E[e^{tX}]<\infty$ for all $t$). Observe that if the condition holds for all convex $1$-Lipschitz functions, then it holds for all convex Lipschitz functions (by scaling), and thus for all convex functions (any convex function is the increasing pointwise limit of convex Lipschitz functions, e.g. its infimal convolutions $f_k(x)=\inf_y\,(f(y)+k|x-y|)$, so monotone convergence applies).
Taking $f(x)=e^{tx}$ in the condition (convex for every fixed $t$), we see that the moment generating functions of the $X_n$ converge pointwise to that of $X$, and thus we have the convergence in distribution.
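For instance (illustrative only), for a standardized sum of $n$ fair $\pm1$ signs the MGF is $(\cosh(t/\sqrt n))^n$, which converges to the Gaussian MGF $e^{t^2/2}$:

```python
import math

def mgf_Xn(t, n):
    # MGF of (e_1 + ... + e_n) / sqrt(n) with i.i.d. fair +/-1 signs e_i:
    # E[exp(t * e / sqrt(n))] = cosh(t / sqrt(n)), raised to the n-th power
    return math.cosh(t / math.sqrt(n)) ** n

t = 1.2
limit = math.exp(t * t / 2)  # MGF of N(0, 1)
assert abs(mgf_Xn(t, 10_000) - limit) < 1e-3
```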
Now, to handle the case where exponential moments don't exist, fix $m>0$ and consider the truncations $Y_n$ of $X_n$ obtained by clamping $X_n$ to $[-m,m]$; that is, $Y_n=\max(\min(X_n,m),-m)$, and similarly $Y$ for $X$. Apply the argument above to obtain $Y_n\overset{\mathcal{D}}{\longrightarrow} Y$ (the truncated variables are bounded, so all exponential moments exist). Finally, let $m\to\infty$: the truncations converge in distribution to the untruncated variables, which allows one to conclude.
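A minimal sketch of the clamping map (note that $x\mapsto\max(\min(x,m),-m)$ is itself $1$-Lipschitz):

```python
def clamp(x, m):
    # Truncate x to the interval [-m, m]; this map is 1-Lipschitz
    return max(min(x, m), -m)

assert clamp(5.0, 2.0) == 2.0
assert clamp(-5.0, 2.0) == -2.0
assert clamp(1.5, 2.0) == 1.5
```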
[EDIT: I didn't notice the convexity condition when I gave the following answer]
One of the many equivalent definitions of convergence in distribution is that your condition holds for all Lipschitz functions. Thus, we must show that if your condition holds for all $1$-Lipschitz functions, then it necessarily holds for all Lipschitz functions. So take $f$ to be an arbitrary Lipschitz function and let $C$ be its Lipschitz constant. If $C>1$, then $f/C$ is $1$-Lipschitz, so applying the condition to $f/C$ and multiplying both sides by $C$ gives the result.
[END EDIT]