What are some bounds on the size of largest subinterval in a random partition of the unit interval?

37 Views Asked by At

Suppose that we chop a carrot of length one for $n$ times independently at uniformly random positions along the carrot. How long is the largest chunk remained? I am interested to find some asymptotic lower and upper bound, that is, find any function $f_1(n)$ and $f_2(n)$ such that the size of the largest chunk is between those functions with probability $1-o(1)$ as $n$ grows.

I tried some solution, but I'm not confident about it, so can anyone help checking it? Any refutation or suggestions would be very helpful!

Suppose that $C_n$ is the size of the largest chunk after randomly chopping for $n$ times. Now let $\varepsilon=\varepsilon(n)>0$ be a sequence on $[0,1]$ such that we can define a partition $[0,1]= [0, \varepsilon) \cup [\varepsilon, 2\varepsilon) \cup \dots \cup [1-\varepsilon, 1]$, a partition of unit interval with $\varepsilon^{-1}$ classes each of which has interval length of $\varepsilon$. Define also an indicator random variable

$X_i(\varepsilon)=\mathbb{1}\big\{$there is no chops in the $i$-th partition class $[(i-1)\varepsilon,i\varepsilon]$ among all of those $n$ random chops$\big\}$

for $i=1,2,\dots, \varepsilon^{-1}$ and $X(\varepsilon)=\sum\limits_{1\le i\le \varepsilon^{-1}} X_i(\varepsilon)$.

So, I thought that $$\mathbb{P}\{C_n \ge \varepsilon(n)\}\ge \mathbb{P}\{X(\varepsilon) > 0\},$$ because if we have that some partition classes had never been chopped, then some of such partition classes must be part of a chunk with size at least $\varepsilon$ (I'm actually not very sure about this).

Now, we bound the complement event $P\{X(\varepsilon)=0\}$. First, notice that $\mathbb{E} X_i = (1-\varepsilon)^n$ so that $$\mathbb{E} X = \varepsilon^{-1}(1-\varepsilon)^n = \varepsilon^{-1}\exp(-\varepsilon n)(1+o(1)).$$ Next, for any $i\neq j$, we have that $\mathbb{E}(X_iX_j) = (1-2\varepsilon)^n$, so that $$\mathbb{E}(X^2)=\varepsilon^{-1}(1-\varepsilon)^n + {\varepsilon^{-1} \choose 2}(1-2\varepsilon)^n\le \varepsilon^{-1}\exp(-\varepsilon n) + \varepsilon^{-2}\exp(-2\varepsilon n).$$

Thus, by Chebysev inequality, $$ \begin{align*} \mathbb{P}\{X(\varepsilon)=0\} \le \mathbb{P}\{|X-\mathbb{E}X|\ge \mathbb{E}X\} & \le \frac{Var X}{(\mathbb{E}X)^2}\\ & = \frac{\mathbb{E}(X^2)}{(\mathbb{E}X)^2} - 1\\ & \le \varepsilon\exp(\varepsilon n). \end{align*} $$

By choosing $\varepsilon(n) = \frac{\log n}{n} - \frac{c \log \log n}{n}$ with some $c>1$, we have that the probability above is bounded by $O(\log^{c-1} n) =o(1)$. Therefore, $$\mathbb{P}\left\{C_n \ge \frac{\log n}{n} - \frac{c \log \log n}{n}\right\}\ge 1- o(1).$$

Next, consider the upper bound for $\mathbb{P}\{X(\varepsilon)>0\}$. If we just choose $\varepsilon(n) = \frac{\log n}{n} - d\frac{\log \log n}{n}$ with some $d<1$, then $\mathbb{E}X(\varepsilon)=\log^{d-1} n \left(1+O\left(\frac{\log \log n}{\log n}\right)\right)$. So, by Markov inequality, $$ \begin{align*} \mathbb{P}\{X(\varepsilon)>0\} & \le \mathbb{E} X = o(1) \end{align*} $$ and therefore $$\mathbb{P}\left\{C_n \le \frac{\log n}{n} - \frac{d \log \log n}{n}\right\}\ge 1- o(1).$$

So, we may conclude that for any $\delta>0$, $$\mathbb{P}\left\{\left|C_n - \left(\frac{\log n}{n} - \frac{\log \log n}{n}\right) \right|< \delta \right\}\ge 1- o(1).$$

My question: Is every step in my solution correct? I just worry that this is too good to be true. Any suggestions to provide a more elegant proof would alse be greatly appreciated. Or is it actually a well known problem that I'm unaware of?

Thank you very much in advanced!