Moment matching uniform distribution on interval with a discrete distribution


Let $U$ be the uniform distribution on $[0,1]$. I want a distribution with finite support that approximates $U$. Namely, it should match the first $n$ moments of $U$. Formally:

Problem. Given $n$, find a distribution $X_n$ supported on a finite $S_n \subset [0,1]$ such that, for all $k \in \{1,2,\cdots,n\}$, $$\mathbb{E}[X_n^k] = \mathbb{E}[U^k] = \frac{1}{k+1},$$ where $U$ is the uniform distribution on $[0,1]$.

I am hoping that there is a simple and explicit distribution. That is, something I can work with.


For example, when $n=1$, we can have $X_1$ be uniform on $\{0,1\}$. In the case $n=2$ or $n=3$, we can set $\mathbb{P}[X_3=0]=\mathbb{P}[X_3=1]=1/6$ and $\mathbb{P}[X_3=1/2]=2/3$.
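As a quick sanity check (my own Python sketch, not part of the original post), one can verify numerically that this three-point distribution matches the first three moments of $U$:

```python
# Check that P[X=0] = P[X=1] = 1/6, P[X=1/2] = 2/3 satisfies
# E[X^k] = 1/(k+1) for k = 1, 2, 3 (i.e. it matches U's first 3 moments).
points = [0.0, 0.5, 1.0]
weights = [1 / 6, 2 / 3, 1 / 6]

for k in range(1, 4):
    moment = sum(w * x**k for x, w in zip(points, weights))
    assert abs(moment - 1 / (k + 1)) < 1e-12
```

(This is exactly the weight pattern of Simpson's rule on $[0,1]$.)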


I will prove the existence of such a solution for all $n$ using orthogonal polynomials. Let $(P_n)$ be the family of orthogonal polynomials associated with $L^2([0,1],d\lambda)$, i.e. $$ \left\{ \begin{array}{ll} \deg(P_n)=n, \\ \displaystyle\int_0^1 P_n(t)P_m(t)\,dt=\delta_{n,m}, & \textrm{where } \delta_{n,m}=\left\{\begin{array}{ll} 1 & \textrm{if } n=m,\\ 0 & \textrm{if } n\neq m,\end{array}\right.\\ P_n^{(n)}(0)> 0 \quad \textrm{(positive leading coefficient)}. \end{array} \right. $$ Such polynomials exist: apply the Gram–Schmidt process to the family $(1,X,X^2,X^3,\dots)$, or write them explicitly in terms of the moments $s_k=\int_0^1 t^k\,dt$ as $$ P_0(x)=1, \qquad P_n(x)=\frac{1}{\sqrt{D_{n-1}D_n}}\left|\begin{array}{cccc}s_0 & s_1 & \dots & s_n\\s_1 & s_2 &\dots & s_{n+1}\\ \vdots & \vdots & & \vdots \\ s_{n-1} & s_n & \dots & s_{2n-1}\\ 1 & x & \dots & x^n \end{array}\right|, $$ where $D_n=\det{(s_{i+j})_{0\leq i,j\leq n}}$.
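The Gram–Schmidt construction can be sketched numerically. In the following Python sketch (my own illustration; the helper names `inner` and `orthonormal_basis` are not from the post), polynomials are coefficient arrays and the inner product is computed from the moments $s_k = 1/(k+1)$:

```python
import numpy as np

def inner(p, q):
    """<p, q> = integral_0^1 p(t) q(t) dt, via s_k = 1/(k+1).
    p, q are coefficient arrays, lowest degree first."""
    return sum(p[i] * q[j] / (i + j + 1)
               for i in range(len(p)) for j in range(len(q)))

def orthonormal_basis(n):
    """Gram-Schmidt on (1, t, t^2, ...) -> orthonormal P_0, ..., P_n."""
    basis = []
    for k in range(n + 1):
        p = np.zeros(k + 1)
        p[k] = 1.0                    # the monomial t^k
        for q in basis:
            p[:len(q)] -= inner(p, q) * q   # subtract projections
        p /= np.sqrt(inner(p, p))     # normalize
        basis.append(p)
    return basis

P = orthonormal_basis(3)
# P[1] comes out proportional to sqrt(3) * (2t - 1),
# the shifted Legendre polynomial of degree 1.
```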

This family satisfies:

  1. $\operatorname{span}(P_0,P_1,\dots,P_n)=\operatorname{span}(1,X,\dots,X^n)=\mathbb{R}_n[X]$, because $\deg(P_k)=k$ for each $k$.

  2. For all $P\in \mathbb{R}_{n-1}[X]$, $$ \int_0^1 P(t) P_n(t)\, dt = 0, $$ because of 1.

  3. For all $n\in\mathbb{N}$, the $n$ zeros of $P_n$ are real, simple, and lie in $(0,1)$.

Indeed, let $x_1, x_2, \dots, x_m$ denote the points of $(0,1)$ where $P_n$ changes sign, and suppose $m\leq n-1$. Then the polynomial $$R(x)=P_n(x) \prod_{i=1}^m (x-x_i)$$ has constant sign on $[0,1]$, so $\int_0^1 R(t)\,dt \neq 0$; but $\prod_{i=1}^m (x-x_i)\in\mathbb{R}_{n-1}[X]$, so by 2 this integral is $0$, a contradiction. Hence $m=n$, i.e. $P_n$ has $n$ simple real zeros in $(0,1)$.

  4. The Christoffel–Darboux formula: writing $a_k$ for the leading coefficient of $P_k$, $$ \sum_{k=0}^{n-1} P_k(x)P_k(y) = \frac{a_{n-1}}{a_n}\,\frac{P_n(x)P_{n-1}(y)-P_n(y)P_{n-1}(x)}{x-y}. $$
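Point 3 can be checked numerically. For the interval $[0,1]$ the orthogonal polynomials are (up to normalization) the shifted Legendre polynomials, so their zeros are the images of the Legendre zeros under $x \mapsto (x+1)/2$. A sketch using NumPy (my own verification, not from the post):

```python
import numpy as np
from numpy.polynomial.legendre import Legendre

n = 5
roots = Legendre.basis(n).roots()   # zeros of the Legendre polynomial in (-1, 1)
zeros = (np.real(roots) + 1) / 2    # the zeros are real; map them to (0, 1)

assert np.all((zeros > 0) & (zeros < 1))           # all zeros lie in (0, 1)
assert len(np.unique(np.round(zeros, 10))) == n    # and they are simple
```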

Gauss quadrature formula

There exist $(\lambda_i)_{1\leq i\leq n}\subset(0,1)$ and $(\mu_i)_{1\leq i\leq n}$ with $\mu_i > 0$ such that, for all $R\in \mathbb{R}_{2n-1}[X]$: $$ \int_0^1 R(t)\, dt= \sum_{i=1}^n \mu_i R(\lambda_i). $$ Proof:

Let $\lambda_1,\dots,\lambda_n$ be the zeros of $P_n$. Dividing $R$ by $P_n$ gives $$ R(x)=Q_{n-1}(x) P_n(x)+ r_{n-1}(x), $$ with $\deg Q_{n-1},\,\deg r_{n-1}\leq n-1$. By the Lagrange interpolation formula at the nodes $\lambda_i$, $$ r_{n-1}(x)=P_n(x)\sum_{i=1}^n \frac{r_{n-1}(\lambda_i)}{P'_n(\lambda_i)(x-\lambda_i)}=P_n(x)\sum_{i=1}^n \frac{R(\lambda_i)}{P'_n(\lambda_i)(x-\lambda_i)}, $$ since $R(\lambda_i)=r_{n-1}(\lambda_i)$. By 2, $\int_0^1 Q_{n-1}(t)P_n(t)\,dt=0$, so $$ \int_0^1 R(t)\, dt= \int_0^1 P_n(t)\sum_{i=1}^n \frac{R(\lambda_i)}{P'_n(\lambda_i)(t-\lambda_i)}\,dt=\sum_{i=1}^n R(\lambda_i)\,\frac{1}{P'_n(\lambda_i)}\int_0^1 \frac{P_n(t)}{t-\lambda_i}\,dt. $$ We put $$ \mu_i=\frac{1}{P'_n(\lambda_i)}\int_0^1 \frac{P_n(t)}{t-\lambda_i}\,dt=\frac{1}{\sum_{k=0}^{n-1} P_k(\lambda_i)^2}>0 $$ (using the Christoffel–Darboux formula).
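The quadrature rule can be checked numerically: NumPy ships the Gauss–Legendre nodes and weights for $[-1,1]$, which map to $[0,1]$ by $t=(x+1)/2$, $\mu=w/2$. A sketch (my own verification):

```python
import numpy as np

n = 4
x, w = np.polynomial.legendre.leggauss(n)   # nodes/weights on [-1, 1]
lam = (x + 1) / 2                           # lambda_i: zeros of P_n in (0, 1)
mu = w / 2                                  # mu_i > 0; they sum to 1

# Exact for every monomial of degree <= 2n - 1,
# hence for all R in R_{2n-1}[X] by linearity.
for k in range(2 * n):
    assert abs(np.sum(mu * lam**k) - 1 / (k + 1)) < 1e-12
```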

So the answer to your question is positive: the discrete measure $\mu=\sum_{i=1}^n \mu_i\, \delta_{\lambda_i}$, supported on the $n$ zeros of $P_n$, matches the first $2n-1$ moments of the uniform distribution (take $R(x)=x^k$ for $k\leq 2n-1$).

In practice: compute the orthogonal polynomial of degree $n$ (e.g. with the determinant formula above), find the zeros of $P_n$, and then obtain the $\mu_i$ by solving the Vandermonde system, as noted in the remark of @Clement C. (Here $P_n$ is, up to normalization, the shifted Legendre polynomial on $[0,1]$, and the $(\lambda_i,\mu_i)$ are the Gauss–Legendre nodes and weights for that interval.)
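Concretely, once the zeros $\lambda_i$ are known, the weights $\mu_i$ solve the Vandermonde system $\sum_i \mu_i \lambda_i^k = 1/(k+1)$ for $k=0,\dots,n-1$. A Python sketch of this recipe (my own illustration; the variable names are not from the post):

```python
import numpy as np

n = 4
# zeros of P_n on [0, 1]: the Gauss-Legendre nodes mapped from [-1, 1]
x, _ = np.polynomial.legendre.leggauss(n)
lam = (x + 1) / 2

# Vandermonde system: sum_i mu_i * lam_i^k = s_k = 1/(k+1), k = 0..n-1
V = np.vander(lam, increasing=True).T   # V[k, i] = lam_i ** k
s = 1 / np.arange(1, n + 1)             # moments s_0, ..., s_{n-1}
mu = np.linalg.solve(V, s)

assert np.all(mu > 0)
# by the theorem, the rule is in fact exact up to degree 2n - 1
for k in range(2 * n):
    assert abs(mu @ lam**k - 1 / (k + 1)) < 1e-10
```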