Given $f_n(x) = \frac{x}{(1+x)^n}$, prove that the series $\sum_{n=1}^{\infty} \frac{x}{(1+x)^n}$ converges on the interval $[1,2]$.
This was an assignment question that I couldn't get. The professor released the solution, but I don't quite understand the steps. I would appreciate some help.

Why did he take the absolute value of $\frac{1}{1+x}$? I think he was trying to show that the series has a finite sum, so it must converge. That part makes sense; I just don't understand the step with the absolute value.
For $x\in[1,2]$, $\quad 2\leq 1+x\leq 3$. So $$\dfrac13\leq\dfrac{1}{1+x}\leq\dfrac12\implies \dfrac{x}{3^n}\leq\dfrac{x}{(1+x)^n}\leq\dfrac{x}{2^n}$$ So the series $\displaystyle\sum_{n=1}^{\infty} \frac{x}{(1+x)^n}$ converges by the comparison test, since $\displaystyle\sum_{n=1}^\infty \dfrac{x}{2^n}=x$ converges for $x\in[1,2]$.
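As a side remark (not part of the professor's argument): for $x\in[1,2]$ the series is geometric with ratio $r=\frac{1}{1+x}\in\left[\frac13,\frac12\right]$, so its sum can be computed exactly, which independently confirms convergence:
$$\sum_{n=1}^{\infty} \frac{x}{(1+x)^n} = x\sum_{n=1}^{\infty}\left(\frac{1}{1+x}\right)^n = x\cdot\frac{\frac{1}{1+x}}{1-\frac{1}{1+x}} = x\cdot\frac{1}{x} = 1.$$
Here the formula $\sum_{n=1}^\infty r^n = \frac{r}{1-r}$ applies because $|r|<1$ on this interval.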
(The absolute value is redundant here, since all the terms involved are positive real numbers.)
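A quick numerical sanity check (not a proof, just an illustration of the comparison-test conclusion): since the series is geometric with ratio $\frac{1}{1+x}$, its partial sums should converge, in fact to $1$, for each $x$ in $[1,2]$. The function name `partial_sum` below is my own choice for illustration.

```python
def partial_sum(x, N):
    """Sum of the first N terms x/(1+x)**n, n = 1..N."""
    return sum(x / (1 + x) ** n for n in range(1, N + 1))

# For x in [1, 2] the ratio 1/(1+x) is at most 1/2, so partial
# sums converge quickly; they should all approach 1.
for x in (1.0, 1.5, 2.0):
    print(f"x = {x}: partial sum of 60 terms = {partial_sum(x, 60):.12f}")
```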