I define a function $$f = \max(\mbox{rand}(0, 1), \mbox{rand}(0, 1))$$
such that $f$ returns the maximum (the greater) of two randomly selected numbers between $0$ and $1$.
Plotting a histogram of 1 million numbers generated this way gives me a distribution that appears to increase linearly:

I'd like to know how this distribution can be derived or quantified.
Let $X$ and $Y$ be independent random variables, each uniformly distributed on $[0,1]$, and let $W=\max(X,Y)$. For $0\le w\le 1$, the maximum is at most $w$ exactly when both variables are, so by independence $$F_W(w)=\Pr(W\le w)=\Pr\left((X\le w)\cap(Y\le w)\right)=\Pr(X\le w)\Pr(Y\le w)=w^2.$$
For the density function $f_W(w)$ of $W$, differentiate $F_W(w)$: we get $f_W(w)=2w$ on the interval $(0,1)$, precisely the linear growth the simulation suggests.
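This is easy to check numerically. Below is a minimal sketch (assuming NumPy and the one-million-draw setup from the question) comparing the empirical CDF of $W=\max(X,Y)$ against $w^2$ at a few points:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# W = max of two independent Uniform(0,1) draws
w = np.maximum(rng.random(n), rng.random(n))

# Empirical CDF Pr(W <= q) should be close to q^2
for q in (0.25, 0.5, 0.75):
    print(f"q={q}: empirical={np.mean(w <= q):.4f}, theoretical={q**2:.4f}")
```

A histogram of `w` (e.g. `np.histogram(w, bins=50)`) shows bin counts growing roughly linearly in $w$, matching the density $2w$.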