Let $k$ be the length of the left interval, so $k\in[0,1]$ and the right interval has length $1-k$. Define the random variable $X=\frac{k}{1-k}$, the ratio of the length of the left interval to that of the right. Since $k$ is uniform, $P\left(X\le\frac{k}{1-k}\right)=k$, and the expected value is $$E(X)=\int_0^1\frac{k}{1-k}\,dk,$$ which is divergent, so the expected value does not exist.
To obtain the expected value of the ratio of the longer to the shorter interval, we can assume the left one is shorter, so $k\in(0,1/2)$ and the ratio is $\frac{1-k}{k}$; by symmetry, $$E=2\int_0^{1/2}\frac{1-k}{k}\,dk.$$
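With the left piece shorter, the longer-to-shorter ratio is $\frac{1-k}{k}$, and its integral can be evaluated directly to exhibit the divergence at $k=0$:

$$2\int_\varepsilon^{1/2}\frac{1-k}{k}\,dk \;=\; 2\Big[\ln k - k\Big]_\varepsilon^{1/2} \;=\; 2\left(\ln\tfrac12-\tfrac12-\ln\varepsilon+\varepsilon\right)\;\xrightarrow{\;\varepsilon\to 0^+\;}\;\infty.$$

So the expected value of the longer-to-shorter ratio is infinite as well.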
I have a strong suspicion that something went wrong there.
$$\begin{array}{rcl} P(0 \le X \le \frac{k}{1-k}) &=& k \\ \displaystyle \int_0^{\frac{k}{1-k}} f(x) \ \mathrm dx &=& k \\ \displaystyle \frac{f\left(\frac{k}{1-k}\right)}{(1-k)^2} &=& 1 \\ \displaystyle f\left(\frac{k}{1-k}\right) &=& (1-k)^2 \\ \displaystyle f(t) &=& \left(1-\dfrac{t}{1+t}\right)^2 \\ \displaystyle f(t) &=& \left(\dfrac{1}{1+t}\right)^2 \\ \end{array}$$
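As a quick sanity check (a sketch, assuming only that the break point $k$ is uniform on $[0,1]$), the empirical CDF of $X = k/(1-k)$ matches the derived CDF $F(t)=\frac{t}{1+t}$:

```python
import random

random.seed(1)

# Sample X = k/(1-k) with k uniform on [0,1) and compare the
# empirical CDF against the derived CDF F(t) = t/(1+t).
ks = [random.random() for _ in range(100_000)]
xs = [k / (1 - k) for k in ks]

for t in (0.5, 1.0, 3.0, 9.0):
    empirical = sum(x <= t for x in xs) / len(xs)
    print(f"t={t}: empirical={empirical:.4f}, F(t)={t / (1 + t):.4f}")
```

The empirical values agree with $t/(1+t)$ to about two decimal places at this sample size.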
Therefore: $$\begin{array}{rcl} E(X) &=& \displaystyle \int_0^\infty xf(x) \ \mathrm dx \\ &=& \displaystyle \int_0^\infty \frac x {(1+x)^2} \ \mathrm dx \\ &=& \displaystyle \left[\frac1{x+1}+\ln(x+1)\right]_0^\infty, \end{array}$$ which diverges.
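The divergence can also be seen numerically: the truncated expectation $\int_0^M \frac{x}{(1+x)^2}\,dx$ tracks the closed form $\ln(1+M)+\frac{1}{1+M}-1$ and grows without bound as $M$ increases. A minimal sketch (the helper name `truncated_mean` and the step count are arbitrary choices, not from the original):

```python
import math

def truncated_mean(M, steps=100_000):
    # Midpoint Riemann sum of x/(1+x)^2 over [0, M].
    h = M / steps
    return sum(((i + 0.5) * h) / (1 + (i + 0.5) * h) ** 2
               for i in range(steps)) * h

for M in (10, 100, 1000):
    closed_form = math.log(1 + M) + 1 / (1 + M) - 1
    print(f"M={M}: numeric={truncated_mean(M):.4f}, closed form={closed_form:.4f}")
```

The truncated mean grows roughly like $\ln M$, consistent with the logarithmic divergence of the antiderivative above.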
Indeed, I just ran a Python simulation: picking the break point uniformly in $[0,1]$, the sample mean of the ratio keeps growing with the sample size, consistent with a divergent expected value.
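Such a simulation can be sketched as follows (the seed and sample sizes are arbitrary choices, and a growing sample mean is only suggestive of divergence, not a proof):

```python
import random

random.seed(0)

def mean_longer_to_shorter(n):
    # Break [0,1] at a uniform point and average longer/shorter.
    # (random.random() lies in [0, 1), so u == 0 is vanishingly rare.)
    total = 0.0
    for _ in range(n):
        u = random.random()
        shorter, longer = min(u, 1 - u), max(u, 1 - u)
        total += longer / shorter
    return total / n

for n in (10**3, 10**5, 10**6):
    print(f"n={n}: sample mean = {mean_longer_to_shorter(n):.1f}")
```

Because the ratio has a heavy $1/x$-like tail, occasional breaks very close to an endpoint dominate the sum, and the running mean never settles.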