I'm reading the proof of the theorem below.
Let $X$ be a random variable and $\left\{X_{n}\right\}$ be a sequence of random variables on the probability space $(\Omega, \mathcal{F}, \mathbb{P})$.
$X_{n} \rightarrow X$ in probability if $\mathbb{P}\left(\left|X_{n}-X\right|>\varepsilon\right) \rightarrow 0$ as $n \rightarrow \infty$ for every $\varepsilon>0$.
$X_{n} \rightarrow X$ in law (or in distribution, or weakly) if $\mathbb{E}\left(f\left(X_{n}\right)\right) \rightarrow \mathbb{E}(f(X))$ for every bounded continuous function $f$.
Prove that $X_{n} \rightarrow X$ in probability implies $X_{n} \rightarrow X$ in law.
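As a quick Monte Carlo sanity check of the statement (my own toy example, not part of the proof): take $X \sim N(0,1)$ and $X_n = X + (1+Z_n)/n$ with $Z_n \sim N(0,1)$ independent of $X$, so $X_n \to X$ in probability, and test against the bounded continuous function $f(x) = \arctan(x)$.

```python
import numpy as np

# Toy illustration (not a proof): X ~ N(0,1), X_n = X + (1 + Z)/n with
# Z ~ N(0,1) independent of X, so X_n -> X in probability.
# f(x) = arctan(x) is bounded and continuous.
rng = np.random.default_rng(0)
N = 200_000
X = rng.standard_normal(N)
Z = rng.standard_normal(N)

def prob_deviation(n, eps=0.1):
    """Monte Carlo estimate of P(|X_n - X| > eps)."""
    Xn = X + (1 + Z) / n
    return float(np.mean(np.abs(Xn - X) > eps))

def law_gap(n):
    """Monte Carlo estimate of |E f(X_n) - E f(X)| for f = arctan."""
    Xn = X + (1 + Z) / n
    return float(abs(np.mean(np.arctan(Xn)) - np.mean(np.arctan(X))))

for n in (1, 10, 100):
    print(n, prob_deviation(n), law_gap(n))
```

Both quantities shrink as $n$ grows, as the theorem predicts.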
Here is the proof:
It seems to me that the existence of a single $\delta$ such that $$\left|f\left(X_{n}\right)-f(X)\right|>\varepsilon \implies \left|X_{n}-X\right|>\delta \quad \text{for all } n \in \mathbb N$$ cannot be obtained from the continuity of $f$ alone; it would follow only if $f$ were assumed to be uniformly continuous. Hence, the hypothesis should read
"... for every bounded and uniformly continuous function $f$"
Could you check if my understanding is correct?

Your understanding is correct: the proof as given is flawed, because it implicitly assumes uniform continuity. However, the statement of the theorem itself is correct as written.
One proof uses the fact that if $X_n \to X$ in probability, then every subsequence has a further subsequence converging to $X$ almost surely. Combined with the dominated convergence theorem (DCT) and the sub-subsequence criterion for convergence of real sequences, this gives a proof.
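Written out, that argument runs as follows (a standard route; $a_n$ and $a$ are my shorthand):

```latex
\begin{aligned}
&\text{Fix a bounded continuous } f \text{ and set } a_n := \mathbb{E}\, f(X_n),\ a := \mathbb{E}\, f(X).\\
&\text{Given any subsequence } (n_k),\ X_{n_k} \to X \text{ in probability, so there is a further}\\
&\text{subsequence with } X_{n_{k_j}} \to X \text{ a.s. By continuity, } f(X_{n_{k_j}}) \to f(X) \text{ a.s.,}\\
&\text{and } |f(X_{n_{k_j}})| \le \sup |f| < \infty, \text{ so DCT gives } a_{n_{k_j}} \to a.\\
&\text{Every subsequence of } (a_n) \text{ thus has a further subsequence converging to } a,\\
&\text{hence } a_n \to a.
\end{aligned}
```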
Hints for an alternative proof: There exists $M$ such that $P(|X| >M) <\epsilon$. There exists $n_0$ such that $P(|X_n-X| >1) <\epsilon$ for $n \geq n_0$. Combining these two gives $P(|X_n| >M+1) <2\epsilon$ for $n \geq n_0$. Now $$\begin{aligned} &E|f(X_n)-f(X)|\\={}&E|f(X_n)-f(X)|I_{|X_n| \leq M+1,\,|X| \leq M+1}+E|f(X_n)-f(X)|I_{|X_n| > M+1 \,\text{ or }\,|X| > M+1}.\end{aligned}$$
In the first term, use uniform continuity of $f$ on $[-M-1,M+1]$. In the second term, use the bound $|f| \le \sup|f|$. I hope you can finish the proof now.
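One way to finish (my completion of the hint; choose $\delta$ by uniform continuity of $f$ on $[-M-1,M+1]$ so that $|x-y|\le\delta$ with $x,y$ in that interval implies $|f(x)-f(y)|\le\varepsilon'$, and bound the bad event by a union of three events):

```latex
\begin{aligned}
E|f(X_n)-f(X)|
&\le \varepsilon'
  + 2\sup|f|\,\Big[P(|X_n-X|>\delta) + P(|X_n|>M+1) + P(|X|>M+1)\Big]\\
&\le \varepsilon' + 2\sup|f|\,(\epsilon + 2\epsilon + \epsilon)
  \qquad \text{for } n \ge \max(n_0, n_1),
\end{aligned}
```

where $n_1$ is chosen so that $P(|X_n-X|>\delta)<\epsilon$ for $n \ge n_1$. Since $\epsilon$ and $\varepsilon'$ were arbitrary, $E|f(X_n)-f(X)| \to 0$.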