What is the average "measure" of the figure spanned by $N$ random points inside an $N$-dimensional ball of radius $1$? Here "measure" means the length, area, volume, and so on of the figure those $N$ points span in $N$-dimensional space.
For two dimensions, my problem is: what is the average distance between two random points in a disk of radius $1$? My friend Jay told me about this version of the problem and said that the answer is $\frac{128}{45\pi}$, but I don't know how to verify this.
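One quick way to check a value like this is a Monte Carlo simulation: sample many pairs of uniform random points in the unit disk and average their distances. A minimal sketch in Python (the helper name is my own), whose output should be close to $\frac{128}{45\pi} \approx 0.9054$:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_in_disk(num):
    """Return `num` points uniform in the unit disk, by rejection sampling."""
    pts = np.empty((0, 2))
    while len(pts) < num:
        cand = rng.uniform(-1.0, 1.0, size=(2 * num, 2))
        pts = np.vstack([pts, cand[(cand ** 2).sum(axis=1) <= 1.0]])
    return pts[:num]

num = 500_000
dist = np.linalg.norm(sample_in_disk(num) - sample_in_disk(num), axis=1)
print(dist.mean())  # close to 128 / (45 * pi) ≈ 0.9054
```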
For three dimensions, my problem is: what is the average area of a triangle defined by three random points in a ball of radius $1$? For an $N$-dimensional ball of radius $1$, what is the average "measure" of the $(N - 1)$-dimensional simplex (the generalization of a triangle) defined by $N$ random points in the ball?
I also would like to know why for two dimensions it is supposedly $\frac{128}{45\pi}$. I know a little bit of calculus 1 and 2 from high school if that is needed to understand this problem.
$\def\d{\mathrm{d}}\def\R{\mathbb{R}}\def\x{\boldsymbol{x}}\def\0{\mathbf{0}}\def\abs#1{\left|#1\right|}\def\paren#1{\left(#1\right)}\def\C{\paren{\sum\limits_{\smash{j = 1}}^n C_j^2}^{\frac{1}{2}}}$This partial answer tries to show the complexity of the problem for general $n$ by deriving the formula for the content of an $(n - 1)$-dimensional body formed by $n$ points in the $n$-dimensional space.
For $\x_1, \cdots, \x_n \in \R^n$ in general position, suppose $\x_i = (x_{i, 1}, \cdots, x_{i, n})$ for all $i$ and define$$ X = \begin{bmatrix} \x_1 \\ \vdots \\ \x_n \end{bmatrix} = \begin{bmatrix} x_{1, 1} & \cdots & x_{1, n}\\ \vdots & \ddots & \vdots\\ x_{n, 1} & \cdots & x_{n, n} \end{bmatrix}. $$ It is well-known that the content of the $n$-dimensional simplex with vertices $\0, \x_1, \cdots, \x_n$ is $V = \dfrac{1}{n!} |\det X|$, and the equation of the $(n - 1)$-dimensional hyperplane passing through $\x_1, \cdots, \x_n$ is\begin{gather*} \det\begin{bmatrix} \x - \x_1 \\ \x_2 - \x_1 \\ \vdots \\ \x_n - \x_1 \end{bmatrix} = \begin{vmatrix} x_1 - x_{1, 1} & x_2 - x_{1, 2} & \cdots & x_n - x_{1, n}\\ x_{2, 1} - x_{1, 1} & x_{2, 2} - x_{1, 2} & \cdots & x_{2, n} - x_{1, n}\\ \vdots & \vdots & \ddots & \vdots\\ x_{n, 1} - x_{1, 1} & x_{n, 2} - x_{1, 2} & \cdots & x_{n, n} - x_{1, n} \end{vmatrix} = 0. \tag{1} \end{gather*} Denoting the coefficient of $x_j$ in (1) by $C_j$ for each $j$, the distance between $\0$ and the hyperplane is known to be$$ d = \frac{1}{\C} \abs{ \det\begin{bmatrix} \0 - \x_1 \\ \x_2 - \x_1 \\ \vdots \\ \x_n - \x_1 \end{bmatrix} } = \frac{1}{\C} \abs{ \det\begin{bmatrix} -\x_1 \\ \x_2 \\ \vdots \\ \x_n \end{bmatrix} } = \frac{|\det X|}{\C}, $$ where the second equality follows from adding $\x_1$ (a multiple of the first row) to each of the remaining rows. Since this simplex is a cone of height $d$ over the $(n - 1)$-dimensional simplex with vertices $\x_1, \cdots, \x_n$, the content $m$ of that base satisfies $V = \frac{1}{n} m d$, which implies$$ m = \frac{nV}{d} = \frac{1}{(n - 1)!}\C. $$
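As a numerical sanity check (my addition, a NumPy sketch rather than part of the argument): for $n = 3$ the coefficients $C_j$ of the hyperplane equation (1) are exactly the components of the cross product $(\x_2 - \x_1) \times (\x_3 - \x_1)$, whose norm is twice the area of the triangle $\x_1\x_2\x_3$. So the content computed from the $C_j$'s can be compared against the triangle area obtained independently:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
n = 3
X = rng.standard_normal((n, n))      # rows are the points x_1, x_2, x_3
D = X[1:] - X[0]                     # rows x_i - x_1 for i = 2, ..., n

# C_j: coefficient of x_j in the hyperplane determinant (1), i.e. the signed
# (n-1)x(n-1) minor obtained by deleting column j from D.
C = np.array([(-1) ** j * np.linalg.det(np.delete(D, j, axis=1))
              for j in range(n)])

m = np.linalg.norm(C) / factorial(n - 1)

# Independent check for n = 3: the triangle area from a cross product.
area = 0.5 * np.linalg.norm(np.cross(X[1] - X[0], X[2] - X[0]))
print(m, area)  # the two values agree
```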
To simplify the expression of the $C_j$'s, note that\begin{align*} C_1 &= \begin{vmatrix} x_{2, 2} - x_{1, 2} & \cdots & x_{2, n} - x_{1, n}\\ \vdots & \ddots & \vdots\\ x_{n, 2} - x_{1, 2} & \cdots & x_{n, n} - x_{1, n} \end{vmatrix}\\ &= \begin{vmatrix} 1 & x_{1, 2} & \cdots & x_{1, n}\\ 0 & x_{2, 2} - x_{1, 2} & \cdots & x_{2, n} - x_{1, n}\\ \vdots & \vdots & \ddots & \vdots\\ 0 & x_{n, 2} - x_{1, 2} & \cdots & x_{n, n} - x_{1, n} \end{vmatrix}\\ &= \begin{vmatrix} 1 & x_{1, 2} & \cdots & x_{1, n}\\ 1 & x_{2, 2} & \cdots & x_{2, n}\\ \vdots & \vdots & \ddots & \vdots\\ 1 & x_{n, 2} & \cdots & x_{n, n} \end{vmatrix} = \sum_{i = 1}^n A_{i, 1}, \end{align*} where $A_{i, j}$ is the $(i, j)$-th cofactor of $X$. It can be derived analogously (though with heavier notation) that $C_j = \sum\limits_{i = 1}^n A_{i, j}$ for all $j$, thus$$ m = \frac{1}{(n - 1)!}\C = \frac{1}{(n - 1)!} \paren{ \sum_{j = 1}^n \paren{ \sum_{i = 1}^n A_{i, j} }^2 }^{\frac{1}{2}}. $$
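The identity $C_j = \sum_i A_{i, j}$ can also be spot-checked numerically (a sketch I added; it uses the standard fact that the cofactor matrix of an invertible $X$ equals $\det(X)\,(X^{-1})^{\mathsf T}$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
X = rng.standard_normal((n, n))

# Cofactor matrix: cof(X)[i, j] = A_{i,j} = det(X) * (X^{-1})^T [i, j].
cof = np.linalg.det(X) * np.linalg.inv(X).T
C_from_cofactors = cof.sum(axis=0)          # column sums: sum_i A_{i,j}

# Direct computation of C_j from the hyperplane determinant (1).
D = X[1:] - X[0]
C_direct = np.array([(-1) ** j * np.linalg.det(np.delete(D, j, axis=1))
                     for j in range(n)])

print(np.allclose(C_from_cofactors, C_direct))  # True
```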
Therefore, the expectation to be computed is$$ \frac{1}{(n - 1)!\, B_n^n} \mathop{\intop\cdots\intop}\limits_{\|\x_1\|, \cdots, \|\x_n\| \leqslant 1} \paren{ \sum_{j = 1}^n \paren{ \sum_{i = 1}^n A_{i, j}(\x_1, \cdots, \x_n) }^2 }^{\frac{1}{2}} \,\d\x_1\cdots\d\x_n, $$ where $B_n = \dfrac{\pi^{\frac{n}{2}}}{\Gamma\paren{ \frac{n}{2} + 1 }}$ is the content of the unit $n$-ball, and $\d\x_i = \d x_{i, 1}\cdots\d x_{i, n}$ for each $i$.
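A closed form for this integral looks hard for general $n$, but it can be estimated by Monte Carlo. The sketch below (my addition; the helper names are made up) samples $n$ uniform points in the unit $n$-ball and averages the simplex content; for $n = 2$ the estimate should approach the $\frac{128}{45\pi} \approx 0.9054$ quoted in the question:

```python
import numpy as np
from math import factorial, pi

rng = np.random.default_rng(3)

def sample_ball(num, n):
    """`num` uniform points in the unit n-ball, by rejection sampling."""
    out = np.empty((0, n))
    while len(out) < num:
        cand = rng.uniform(-1.0, 1.0, size=(4 * num, n))
        out = np.vstack([out, cand[(cand ** 2).sum(axis=1) <= 1.0]])
    return out[:num]

def simplex_content(X):
    """Content of the (n-1)-simplex whose vertices are the rows of X."""
    n = X.shape[0]
    D = X[1:] - X[0]
    C = [(-1) ** j * np.linalg.det(np.delete(D, j, axis=1)) for j in range(n)]
    return np.linalg.norm(C) / factorial(n - 1)

n, trials = 2, 50_000
samples = np.stack([sample_ball(trials, n) for _ in range(n)], axis=1)
est = np.mean([simplex_content(S) for S in samples])
print(est, 128 / (45 * pi))  # the estimate is close to 0.9054
```

Setting `n = 3` in the same sketch estimates the average triangle area in the unit ball asked about in the question.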