Let $X$ be a uniform random variable on $(0,1)$ and let $Y=\sqrt{X}$. We can find $F_Y(y)$ and $f_Y(y)$:
$ F_Y(y) = \begin{cases} 1 & \text{if $1 \leq y$} \\ y^2 & \text{if $0<y<1$} \\ 0 & \text{if $y \leq 0$} \end{cases} $
and so, differentiating with respect to $y$,
$ f_Y(y) = \begin{cases} 2y & \text{if $0<y<1$} \\ 0 & \text{otherwise} \end{cases} $
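(Not part of the question itself, but a quick Monte Carlo sanity check, written by me just to convince myself the derivation is right: the empirical CDF of $\sqrt{X}$ should match $y^2$.)

```python
import random

# Check that Y = sqrt(X), with X ~ Uniform(0,1), has CDF F_Y(y) = y^2
# by comparing the empirical fraction of samples with Y <= y against y^2.
random.seed(0)
n = 200_000
ys = [random.random() ** 0.5 for _ in range(n)]

for y in (0.25, 0.5, 0.75):
    empirical = sum(v <= y for v in ys) / n
    print(f"F_Y({y}) estimate: {empirical:.4f}   exact y^2: {y * y:.4f}")
```

The estimates land within a fraction of a percent of $y^2$, consistent with $f_Y(y)=2y$ on $(0,1)$.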
I would like to know if there is a relationship between the fact that $Y=g(X)=\sqrt{X}$ is increasing at a decreasing rate and the nature of $F_Y(y)$ and $f_Y(y)$. That is, $g(0.4)-g(0.2) > g(0.6)-g(0.4) > g(0.8)-g(0.6)$, even though the $x$-intervals are all the same size.
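To make those increments concrete, here is a tiny computation (my own, just illustrating the inequality above): equal steps in $x$ produce strictly shrinking steps in $y=\sqrt{x}$.

```python
import math

# Successive increments of g(x) = sqrt(x) over equal x-steps of 0.2.
xs = [0.2, 0.4, 0.6, 0.8]
vals = [math.sqrt(x) for x in xs]
diffs = [b - a for a, b in zip(vals, vals[1:])]
print([round(d, 3) for d in diffs])  # each difference smaller than the last
```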
I heard something about this in a lecture years ago and it made sense at the time, but looking at my notes I can't figure out why I wrote "$f_Y(y)=2y$ since $Y=g(X)=\sqrt{X}$ becomes more dense as $X$ increases."
I was really impressed with this since it has like 5 stars next to it.
Any help would be gratefully received. Googling phrases like "density of random variables transformation" has not been effective.
Naturally, what I'm looking for is a general principle that applies to any case where $g(x)$ increases at a decreasing rate.