Exercise 5.2 in Elements of Statistical Learning
The goal is to show that an order-$M$ B-spline basis function is the density function of the sum (equivalently, the convolution of the densities) of $M$ uniform random variables. I see the idea intuitively, but I am looking for an elegant, complete solution. Below are my attempts.
We denote by $B_{i,m}(x)$ the $i$-th B-spline basis function of order $m \le M$ for the knot sequence $\tau$, defined as
$$B_{i,1}(x)=\begin{cases} 1 & \text{if } \: \: \: \tau_{i} \leq x \leq \tau_{i+1}\\ 0 & \text{otherwise} \end{cases}$$
for $i=1,\dotsc ,K+2M-1$
$$B_{i,m}(x)= \frac{x-\tau_{i}}{\tau_{i+m-1}-\tau_{i}} B_{i,m-1}(x) + \frac{\tau_{i+m}-x}{\tau_{i+m}-\tau_{i+1}} B_{i+1,m-1}(x)$$
for $i=1, \dotsc ,K+2M-m$.
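To make the recursion concrete, here is a minimal (hypothetical, 0-indexed) implementation of the Cox–de Boor recursion above; the function name and the convention that a term with a zero denominator is dropped (i.e. $0/0 := 0$) are my own additions:

```python
def bspline_basis(i, m, x, tau):
    """Evaluate the B-spline basis function B_{i,m}(x) for knot sequence tau.

    0-indexed: tau[i] here plays the role of tau_{i+1} in the text.
    The half-open interval at order 1 avoids double counting at interior
    knots; it agrees with the text's closed interval except on a null set.
    """
    if m == 1:
        return 1.0 if tau[i] <= x < tau[i + 1] else 0.0
    out = 0.0
    # Convention: a term whose denominator vanishes (repeated knots) is dropped.
    if tau[i + m - 1] > tau[i]:
        out += (x - tau[i]) / (tau[i + m - 1] - tau[i]) * bspline_basis(i, m - 1, x, tau)
    if tau[i + m] > tau[i + 1]:
        out += (tau[i + m] - x) / (tau[i + m] - tau[i + 1]) * bspline_basis(i + 1, m - 1, x, tau)
    return out
```

For example, with uniform knots $0,1,2$, the order-2 basis function is the triangular hat: `bspline_basis(0, 2, 0.5, [0, 1, 2, 3])` gives `0.5`, and on uniform knots the basis functions sum to 1 at interior points (partition of unity).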
The distribution for the sum of $M$ uniform RVs is the convolution of density functions. I read the exercise as "show that the $i$-th B-Spline of order $M$ is the density function for the sum of $M$ uniform RVs".
Using the characteristic function, we can write
$$P_{ X_1 + \dotsb +X_M }(u)=\mathcal{F}^{-1}\!\!\left[ \left( \frac{i(1-e^{it}) }t \right)^M \right]\!\!(u)$$
(An elaboration on how to obtain this last result is welcome.) After calculation,
$$P_{ X_1 + \dotsb +X_M }(u) = \frac{1}{2(M-1)!}\sum^{M}_{k=0}(-1)^{k}\binom{M}{k}(u-k)^{M-1} \mathrm{sgn}(u-k) \tag{*}$$
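As a sanity check of $(*)$ (this is the Irwin–Hall density), here is a short sketch that evaluates the closed form directly; the function name is mine:

```python
from math import comb, factorial

def irwin_hall(M, u):
    """Formula (*): density of X_1 + ... + X_M with X_j i.i.d. U(0,1)."""
    sgn = lambda t: (t > 0) - (t < 0)
    return sum((-1) ** k * comb(M, k) * (u - k) ** (M - 1) * sgn(u - k)
               for k in range(M + 1)) / (2 * factorial(M - 1))

# M = 2: triangular density on [0, 2], value 0.5 at u = 0.5 and u = 1.5
assert abs(irwin_hall(2, 0.5) - 0.5) < 1e-12
assert abs(irwin_hall(2, 1.5) - 0.5) < 1e-12
# M = 3: piecewise-quadratic density, f(3/2) = 3/4
assert abs(irwin_hall(3, 1.5) - 0.75) < 1e-12
```

The formula also vanishes identically outside $[0, M]$, since there the sum reduces to the $M$-th finite difference of a degree-$(M-1)$ polynomial.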
I propose to proceed by induction.
$$M=2: \qquad P_{X_{1}+X_{2}}(x)=\begin{cases} x & \text{if } \: \: \: 0 \leq x \leq 1\\ 2-x & \text{if } \: \: \: 1 \leq x \leq 2\\ 0 & \text{otherwise} \end{cases}$$
while
$$B_{i,2}(x)=\begin{cases} \dfrac{x-\tau_{i}}{\tau_{i+1}-\tau_{i}} & \text{if } \: \: \: \tau_{i} \leq x \leq \tau_{i+1}\\\\ \dfrac{\tau_{i+2}-x}{\tau_{i+2}-\tau_{i+1}} & \text{if } \: \: \: \tau_{i+1} \leq x \leq \tau_{i+2}\\\\ 0 & \text{otherwise} \end{cases}$$
which is the same as $P_{X_{1}+X_{2}}$ up to the change of variable $u = \dfrac{x-\tau_{i}}{\tau_{i+1}-\tau_{i}}$. A more rigorous treatment of this change of variable, or an argument showing the equivalence of the two expressions, or both, is welcome.
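Here is a sketch of how the change of variable can be made precise, assuming equally spaced knots with spacing $h=\tau_{i+1}-\tau_{i}=\tau_{i+2}-\tau_{i+1}$. Set $u=\dfrac{x-\tau_i}{h}$. For $\tau_i\le x\le\tau_{i+1}$ we get $0\le u\le1$ and $B_{i,2}(x)=u$, while for $\tau_{i+1}\le x\le\tau_{i+2}$ we get $1\le u\le2$ and $B_{i,2}(x)=\dfrac{\tau_{i+2}-x}{h}=2-u$. Hence
$$B_{i,2}(x)=P_{X_1+X_2}\!\left(\frac{x-\tau_i}{h}\right)$$
pointwise. Since the density of $\tau_i+h(X_1+X_2)$ is $\frac{1}{h}\,P_{X_1+X_2}\!\left(\frac{x-\tau_i}{h}\right)$, the basis function $B_{i,2}$ is exactly that density when $h=1$, and $h$ times it for general spacing.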
Induction step. We assume the property holds at order $M$.
$$B_{i,M+1}(x)= \frac{x-\tau_{i}}{\tau_{i+M}-\tau_{i}} B_{i,M}(x) + \frac{\tau_{i+M+1}-x}{\tau_{i+M+1}-\tau_{i+1}} B_{i+1,M}(x)$$
How can we properly show that $B_{i,M+1}(x)$ can be written in the form $(*)$ with $M+1$ uniform RVs added?

One idea would be to express the density of the sum of $M+1$ RVs in $(*)$ recursively in terms of the density of the sum of $M$ RVs. How can this be written properly?
For convenience, we define that $\tau_1=\tau_2=\cdots=\tau_{M}=0$, $\tau_{M+k}=k,\forall k\in \{1,2,\cdots,K\}$, and $\tau_{M+K+1}=\tau_{M+K+2}=\cdots=\tau_{M+2K}=K+1$, $K\ge M$.
Then we know that $B_{M+k,1}(x+k)$ is the density of $U(0,1)$, $\forall k\in\{0,1,\cdots,K\}$.
Suppose that $B_{M+k,m}(x+k)$ is the density function of a sum of $m$ i.i.d. $U(0,1)$ random variables for all $1 \le m \le M$ and $k\in\{0,1,\cdots,K-m+1\}$. We want to prove by induction that $B_{M+k,M+1}(x+k)$ is the density function of a sum of $M+1$ i.i.d. $U(0,1)$ random variables, for $k \in\{0,1,\cdots,K-M\}$.
$$ \begin{aligned} B_{M+k,M+1}(x+k)&=\frac{x+k-\tau_{M+k}}{\tau_{2M+k}-\tau_{M+k}}B_{M+k,M}(x+k)+\frac{\tau_{2M+k+1}-x-k}{\tau_{2M+k+1}-\tau_{M+k+1}}B_{M+k+1,M}(x+k)\\ &=\frac{x}{M}B_{M+k,M}(x+k)+\frac{M+1-x}{M}B_{M+k+1,M}(x+k)\\ &=\frac{x}{M}B_{M,M}(x)+\frac{M+1-x}{M}B_{M+1,M}(x)\\ &=B_{M,M+1}(x)=\frac{x}{M}B_{M,M}(x)+\frac{M+1-x}{M}B_{M,M}(x-1), \end{aligned} $$ where the third equality uses the translation invariance of B-splines over uniformly spaced knots, $B_{M+k,m}(x+k)=B_{M,m}(x)$. Thus, we only need to prove that $B_{M,M+1}(x)$ is the density function of a sum of $M+1$ i.i.d. $U(0,1)$ random variables.
Using $(*)$ from the OP, we can easily verify that $$P_{X_1+\cdots+X_{M+1}}(x)=\frac{x}{M}P_{X_1+\cdots+X_{M}}(x)+\frac{M+1-x}{M}P_{X_1+\cdots+X_{M}}(x-1),$$ i.e., $B_{M,M+1}(x)$ is the density function of a sum of $M+1$ i.i.d. $U(0,1)$ random variables, since $P_{X_1+\cdots+X_{M}}(x)=B_{M,M}(x)$ by the induction hypothesis.
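As a numerical sanity check of the whole argument, the following sketch (function names and 0-indexing are my own) compares the Cox–de Boor evaluation over the knot sequence defined above with the closed form $(*)$, for all orders up to $M+1$ and all admissible shifts $k$:

```python
from math import comb, factorial

def bspline(i, m, x, tau):
    """Cox-de Boor recursion, 0-indexed knots (tau[j] = tau_{j+1} in the text),
    with the convention that terms with a zero denominator are dropped."""
    if m == 1:
        return 1.0 if tau[i] <= x < tau[i + 1] else 0.0
    out = 0.0
    if tau[i + m - 1] > tau[i]:
        out += (x - tau[i]) / (tau[i + m - 1] - tau[i]) * bspline(i, m - 1, x, tau)
    if tau[i + m] > tau[i + 1]:
        out += (tau[i + m] - x) / (tau[i + m] - tau[i + 1]) * bspline(i + 1, m - 1, x, tau)
    return out

def irwin_hall(m, u):
    """Formula (*): density of the sum of m i.i.d. U(0,1) variables."""
    sgn = lambda t: (t > 0) - (t < 0)
    return sum((-1) ** k * comb(m, k) * (u - k) ** (m - 1) * sgn(u - k)
               for k in range(m + 1)) / (2 * factorial(m - 1))

M, K = 3, 6
# Knot sequence from the answer: tau_1..tau_M = 0, tau_{M+j} = j, then K+1 repeated.
tau = [0] * M + list(range(1, K + 1)) + [K + 1] * K

for m in range(1, M + 2):          # orders 1 .. M+1
    for k in range(0, K - m + 1):  # shifts with uniform interior knots
        for x in (0.25, 0.8, m / 2 + 0.1):
            # text index M+k corresponds to Python index M+k-1
            assert abs(bspline(M + k - 1, m, x + k, tau) - irwin_hall(m, x)) < 1e-12
```

Every shifted basis function $B_{M+k,m}(x+k)$ agrees with the Irwin–Hall density of order $m$, which is exactly the claim being proved.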