How to describe the process of adding more and more random non-overlapping intervals into a given interval?


Suppose I place $N$ randomly positioned, non-overlapping intervals $[X_{i-1,N},X_{i-1,N} + \Delta_N]$, $i=1,\cdots,N$, of common length $\Delta_N$ inside the interval $[0, 1]$ on the real axis, so that $$ N\Delta_N + \sum_{i=1}^N y_{i,N} = 1, $$ where, for $i < N$, $y_{i,N}$ denotes the length of the gap between the intervals $[X_{i-1,N},X_{i-1,N} + \Delta_N]$ and $[X_{i,N},X_{i,N} + \Delta_N]$, and $y_{N,N}$ is the gap between the last interval and the point $1$. (Set $X_{0,N} = 0$, so the first interval starts at the origin.)
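To make the setup concrete, here is a minimal simulation sketch. The question does not fix the joint distribution of the starts $X_{i,N}$; the uniform-spacings construction below (and the name `place_intervals`) are my own assumptions, chosen only to illustrate one natural way to sample such a configuration:

```python
import random

def place_intervals(n, delta, seed=None):
    """Sample starts X_0 = 0 < X_1 < ... < X_{n-1} of n non-overlapping
    intervals of common length delta inside [0, 1].

    Assumed distribution (not specified in the question): the gaps are
    uniform spacings, obtained by the standard trick of dropping n - 1
    i.i.d. uniform points in [0, 1 - n*delta], sorting them, and shifting
    the j-th point right by (j + 1) * delta.
    """
    assert n * delta <= 1, "total interval length must fit in [0, 1]"
    rng = random.Random(seed)
    slack = 1 - n * delta  # total gap length: sum of the y_{i,N}
    u = sorted(rng.uniform(0, slack) for _ in range(n - 1))
    # X_{0,N} = 0 pins the first interval at the origin.
    return [0.0] + [u[j] + (j + 1) * delta for j in range(n - 1)]
```

By construction the starts are increasing with spacing at least `delta`, the last interval ends at or before $1$, and the gap lengths sum to $1 - N\Delta_N$, matching the identity above.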

I want to study the convergence of certain functions defined on $[0, 1]$ as $N\to \infty$. My question is: how can I describe the limiting process as $N\to \infty$? In other words, are there known mathematical models (perhaps random processes) that describe this limit? I suspect such models exist, but I am not yet familiar with them.