I am a bit confused trying to understand what uniform continuity means. From the definition we have:
$$\forall \varepsilon > 0 \; \exists \delta > 0 \; \forall x \in X \; \forall y \in X : \, d_1(x,y) < \delta \, \Rightarrow \,d_2(f(x),f(y)) < \varepsilon$$
Let's say we set $f(x)=x^2$ with $x\in[1,2]$ and $\epsilon=1$. Then the target inequality is $|x^2-y^2|<1$, and we must pick a $\delta$ such that for all $x,y$: $|x-y|<\delta \implies |f(x)-f(y)|<1$. Is this correct?
I made a "finite experiment" like this: suppose we pick $y_0=1, y_1=1.1, y_2=1.2,\dots$ For each of these values $y_i$, we must "solve" the following system of inequalities:
$$|f(x)-f(y_i)|<1 \hspace{1cm} 1\leq x \leq 2 $$
for $x$, and then we get intervals such as:
$$x_i'< x_i < x_i''$$
These intervals are represented in the plot below, for the different values of $y_i$:
And then we must pick $\delta < \min\{y_1-x_1', x_1''-y_1, y_2-x_2', x_2''-y_2, \dots, y_n-x_n', x_n''-y_n\}$, because this guarantees that whenever $x$ and $y$ lie in one of the intervals, $|x-y|<\delta \implies |f(x)-f(y)|<1$. Is my interpretation correct?
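If it helps, the finite experiment can be sketched numerically. This is only an illustration of the idea (the grid of $y_i$, the helper `interval`, and the rule of ignoring endpoints clipped by the domain are my own assumptions, not part of the post): for each $y_i$, solving $|x^2-y_i^2|<1$ gives $\sqrt{y_i^2-1} < x < \sqrt{y_i^2+1}$, which we intersect with $[1,2]$.

```python
import math

# For each sample y_i in [1, 2], solve |x^2 - y_i^2| < 1 for x:
# sqrt(y_i^2 - 1) < x < sqrt(y_i^2 + 1), intersected with [a, b] = [1, 2].
def interval(y, eps=1.0, a=1.0, b=2.0):
    lo = math.sqrt(max(y * y - eps, 0.0))  # x_i'
    hi = math.sqrt(y * y + eps)            # x_i''
    return max(lo, a), min(hi, b)

ys = [1.0 + 0.1 * i for i in range(11)]  # y_0 = 1, y_1 = 1.1, ..., y_10 = 2
gaps = []
for y in ys:
    lo, hi = interval(y)
    if lo > 1.0:  # an endpoint clipped by the domain imposes no constraint
        gaps.append(y - lo)
    if hi < 2.0:
        gaps.append(hi - y)

delta = min(gaps)  # any delta below this works for the sampled y_i
print(round(delta, 4))  # → 0.2679, i.e. 2 - sqrt(3)
```

Note that this only certifies the finitely many sampled $y_i$; the point of a proof is to handle every pair at once.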

$$\def\d{\delta}\def\e{\epsilon}$$ Sometimes people think that we're trying to get the best possible $\d$ for a given $\e$. That's not the point. The big difference is between some $\d$ working vs. no $\d$ working for a given $\e$. Here's a crude approach to proving that $f(x)=x^2$ is uniformly continuous on $[1,2]$. We have $$ |f(x)-f(y)| = |x^2-y^2| = |x-y|\,|x+y| \le |x-y| \, 10^{10}, $$ since $|x+y| \le 10^{10}$ for all $x,y\in [1,2]$.
Having done some scratch work off to the side, we're ready to proceed with our proof. Given $\e>0$, let $\d=\e/10^{10}$. Then for all $x,y\in [1,2]$ we have $$ |x-y| < \d \implies |f(x)-f(y)| \le |x-y| \, 10^{10} < \d \, 10^{10} = \e.$$
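A quick numerical spot-check (not a proof — the random sampling scheme is my own illustration) confirms that this crude $\d$ really does the job:

```python
import random

# With eps = 1 and delta = eps / 10**10, random pairs in [1, 2] at
# distance < delta should always satisfy |x^2 - y^2| < eps.
eps = 1.0
delta = eps / 10**10
random.seed(0)
for _ in range(100_000):
    x = random.uniform(1.0, 2.0)
    y = min(max(x + random.uniform(-delta, delta), 1.0), 2.0)  # |x - y| < delta
    assert abs(x * x - y * y) < eps
print("no counterexample found")
```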
We've found a way of handling not just the finite list of inequalities you considered, but all such inequalities simultaneously. Did we get the best $\d$? No! Did we get a $\d$ that worked? Yes!
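To make "best vs. working" concrete: on $[1,2]$ the sharper bound is $|x+y|\le 4$, which gives $\d=\e/4$. This refinement is my addition, not part of the proof above; the small check below shows $\e/4$ succeeding at the worst corner $x=y=2$ while a slightly larger $\d=0.3$ already fails there.

```python
eps = 1.0

def close_enough(x, y):
    return abs(x * x - y * y) < eps

# delta = eps/4 (from the sharper bound |x + y| <= 4) works at the
# worst corner x = y = 2:
delta = eps / 4
x, y = 2.0, 2.0 - 0.99 * delta
assert abs(x - y) < delta and close_enough(x, y)

# but delta = 0.3 > eps/4 already fails for a pair near that corner:
x, y = 2.0, 1.71  # |x - y| = 0.29 < 0.3
assert abs(x - y) < 0.3 and not close_enough(x, y)
print("eps/4 works here; 0.3 does not")
```

Both $\e/4$ and the crude $\e/10^{10}$ are valid choices; the crude one just leaves enormous slack.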