My probability text has the following theorem:
When a transformation of a random variable $Y = r(X)$ is strictly monotone increasing or strictly monotone decreasing then $r$ has an inverse $s$ and in this case one can show that:
$$f_Y(y) = f_X(s(y))\left|\frac{ds(y)}{dy}\right|$$
In order to derive it, my first attempt is differentiating the CDF of $Y$:
$$F_Y(y) = \mathbb{P}(Y \leq y) = \mathbb{P}(X \leq s(y)) = F_X(s(y))$$
$$\frac{d}{dy}F_Y(y) = \frac{d}{dy}F_X(s(y))$$
$$f_Y(y) = f_X(s(y))\frac{ds}{dy}$$
In the last step I applied the chain rule, which as far as I know never introduces absolute values. What am I missing here? Googling around, it seems that some people refer to the absolute-value part as the Jacobian -- which I vaguely remember from calculus as the generalization of the chain rule to higher dimensions -- but since there is only a single dimension here, I don't understand why that would come into play.
If $r$ (and hence $s$) is decreasing rather than increasing, then $\mathbb{P}(Y\leq y)=\mathbb{P}(X\geq s(y))$ rather than $\mathbb{P}(X\leq s(y))$. So in that case you get $F_Y(y)=1-F_X(s(y))$ and end up with a minus sign when you differentiate. But also in that case, $\frac{ds}{dy}$ is negative, so you can absorb the minus sign by replacing it with its absolute value.
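To make the decreasing case concrete, here is one example (my own illustrative choice, not from the question): take $X \sim \text{Exponential}(1)$ with $f_X(x) = e^{-x}$ for $x > 0$, and let $Y = r(X) = e^{-X}$, which is strictly decreasing. Then $s(y) = -\ln y$ on $(0,1)$ and $\frac{ds}{dy} = -\frac{1}{y} < 0$, so
$$f_Y(y) = f_X(-\ln y)\left|-\frac{1}{y}\right| = e^{\ln y}\cdot\frac{1}{y} = 1,$$
i.e. $Y$ is uniform on $(0,1)$. Without the absolute value you would instead get $f_X(s(y))\frac{ds}{dy} = y\cdot\left(-\frac{1}{y}\right) = -1$, which is not a valid density.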
(Note that this is the same reason you take the absolute value of the Jacobian in the change-of-variables formula. In the one-dimensional case, instead of writing the absolute value, we customarily adjust the bounds of integration: when you change variables by a decreasing function, the bounds come out in reverse order, which introduces a minus sign if you then switch them. If you always kept the bounds of integration with the lower bound smaller than the upper bound, then you would need to use the absolute value of the derivative when making a change of variables.)
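You can also sanity-check the formula numerically. The sketch below (assuming NumPy is available; the exponential example is my own choice) samples $X \sim \text{Exponential}(1)$, applies the strictly decreasing map $Y = e^{-X}$, and checks that the empirical density of $Y$ matches the constant $1$ that the formula with the absolute value predicts:

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Exponential(1); r(x) = exp(-x) is strictly decreasing.
x = rng.exponential(1.0, size=200_000)
y = np.exp(-x)

# Inverse: s(y) = -log(y), with ds/dy = -1/y (negative, as expected
# for a decreasing r).  The formula predicts
#   f_Y(y) = f_X(-log y) * |-1/y| = y * (1/y) = 1  on (0, 1),
# i.e. Y should be Uniform(0, 1).
hist, edges = np.histogram(y, bins=20, range=(0.0, 1.0), density=True)
print(hist)  # every bin height should be close to 1
```

With 200,000 samples the histogram bins all land near 1; dropping the absolute value would predict a density of $-1$, which no histogram can match.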