What is the average length of a line segment in a $1 \times 1$ field?
Given
$$x_1, y_1, x_2, y_2 \in [0,1]$$
$$S = (x_1,y_1,x_2,y_2)$$
$$\operatorname{dist}(S) = \sqrt{(x_1-x_2)^2+(y_1-y_2)^2}$$
Find $$\mathbb{E}[\operatorname{dist}(S)] = \lim_{N\to\infty}\frac{1}{N}\sum_{n=1}^{N} \operatorname{dist}(S_n),$$ where $S_1, S_2, \ldots$ are independent random segments drawn as above.
I know that it's $\approx \boxed{0.52}$ from writing a Monte Carlo program that sums the lengths of many random line segments and divides by the number of samples, but what's the closed form?
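For reference, a minimal version of such a Monte Carlo estimate (a sketch, not my exact program; the function name is just for illustration):

```python
import math
import random

def avg_segment_length(samples=1_000_000, seed=0):
    """Estimate the mean distance between two uniform random
    points in the unit square by direct sampling."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        x1, y1, x2, y2 = (rng.random() for _ in range(4))
        total += math.hypot(x1 - x2, y1 - y2)
    return total / samples

print(avg_segment_length())  # ≈ 0.5214
```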
Assuming that $x_1,x_2,y_1,y_2$ are independent and uniformly distributed over $[0,1]$, both differences $x_1-x_2$ and $y_1-y_2$ follow the triangular distribution on $[-1,1]$, with pdf $$ g(x) = (1-|x|)\cdot\mathbb{1}_{[-1,1]}(x). $$ Hence the average length is given by:
$$\begin{eqnarray*}\iint_{[-1,1]^2}\sqrt{a^2+b^2}(1-|a|)(1-|b|)\,da\,db &=&\color{blue}{ \frac{2+\sqrt{2}+5\log(1+\sqrt{2})}{15}}\\&=&\color{red}{0.52140543316472\ldots},\end{eqnarray*}$$ agreeing with the Monte Carlo simulation.
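As a quick numerical check (a sketch under the assumptions above), the double integral against the two triangular densities can be approximated with a midpoint rule and compared against the closed form:

```python
import math

# Closed form from the answer.
exact = (2 + math.sqrt(2) + 5 * math.log(1 + math.sqrt(2))) / 15

# Midpoint-rule quadrature of
#   ∬ sqrt(a^2 + b^2) (1-|a|)(1-|b|) da db  over [-1,1]^2.
n = 400
h = 2 / n
total = 0.0
for i in range(n):
    a = -1 + (i + 0.5) * h
    for j in range(n):
        b = -1 + (j + 0.5) * h
        total += math.hypot(a, b) * (1 - abs(a)) * (1 - abs(b))
total *= h * h

print(exact)  # 0.52140543316472...
print(total)  # agrees with the closed form to several decimal places
```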