I'm reading the book Bayesian Data Analysis by Gelman, and on page 52 he makes a change of variables.
This is a very basic calculus question, but I'm a little rusty. Could someone remind me why this is true, or point me to a good resource with a proof?

This is the standard way to handle a $1\rightarrow 1$ transformation when the transformation function is monotonic.
Let $X\sim f_X(x)$ and $Y=g(X)$, with $g$ continuous and strictly increasing (so that $g^{-1}$ exists).
Thus
$$F_Y(y)=P(Y\leq y)=P(g(X) \leq y)=P(X\leq g^{-1}(y))=F_X(g^{-1}(y))$$
Differentiating with respect to $y$ gives
$$f_Y(y)=f_X(g^{-1}(y))\cdot \frac{d}{dy}g^{-1}(y)$$
Your statement is the same as the one I showed you; you can write
$$f_Y(y)=f_X(g^{-1}(y))\left|\frac{dx}{dy}\right|$$
or
$$f_Y(y)=f_X(x)\left|\frac{1}{\frac{dy}{dx}}\right|$$
...as you prefer.
The absolute value is needed because if $g$ is monotonic but decreasing, the derivative $\frac{dx}{dy}$ is negative, yet the same result holds.
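To see the absolute value at work, here is a quick Monte Carlo sketch (my own illustration, not from the book) using a *decreasing* transformation; the distributions and the transformation here are chosen just for the demonstration:

```python
import random
import math

random.seed(0)

# Take X with density f_X(x) = 2x on (0, 1); its CDF is x^2, so we can
# sample it by inverse CDF as X = sqrt(U) with U ~ Uniform(0, 1).
# Let Y = g(X) = 1 - X, a DECREASING transformation.
# Then g^{-1}(y) = 1 - y and dx/dy = -1, so the formula with the
# absolute value predicts f_Y(y) = f_X(1 - y) * |-1| = 2(1 - y),
# i.e. the CDF F_Y(y) = 1 - (1 - y)^2.
n = 200_000
ys = [1 - math.sqrt(random.random()) for _ in range(n)]

def F_Y(y):
    # CDF predicted by the change-of-variables formula
    return 1 - (1 - y) ** 2

for y in (0.25, 0.5, 0.75):
    empirical = sum(v <= y for v in ys) / n
    print(f"y={y}: empirical {empirical:.3f} vs formula {F_Y(y):.3f}")
```

The empirical CDF of the simulated $Y$ values matches $1-(1-y)^2$; without the absolute value the formula would produce a negative "density".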
Example:
$$f_X(x)=3x^2$$
$x \in (0,1)$
$$Y=-\log X$$
thus
$x=e^{-y}$; $\left|\frac{dx}{dy}\right|=e^{-y}$
and
$$f_Y(y)=3 e^{-2y}e^{-y}=3 e^{-3y}$$
that is, $Y\sim \text{Exp}(3)$, an exponential distribution with rate $3$.
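You can check this example numerically with a quick simulation sketch (my own check, not from the book):

```python
import random
import math

random.seed(1)

# f_X(x) = 3x^2 on (0, 1) has CDF x^3, so by inverse CDF we can
# sample X = U^(1/3) with U ~ Uniform(0, 1).
# The claim is Y = -log X ~ Exp(rate 3), i.e. F_Y(y) = 1 - e^{-3y},
# with mean 1/3.
n = 200_000
ys = [-math.log(random.random() ** (1 / 3)) for _ in range(n)]

for y in (0.1, 0.3, 0.6):
    empirical = sum(v <= y for v in ys) / n
    theoretical = 1 - math.exp(-3 * y)
    print(f"y={y}: empirical {empirical:.3f} vs Exp(3) CDF {theoretical:.3f}")

print(f"sample mean {sum(ys) / n:.4f} (Exp(3) mean is 1/3)")
```

The empirical CDF and sample mean agree with the $\text{Exp}(3)$ distribution obtained from the formula.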