In the theoretical analysis of stochastic optimization algorithms, we often need to prove that a bound such as $$\text{error}\leq\epsilon\quad(\text{under some conditions})$$ holds with probability at least $1-\delta$. But I have found that many works use a specific number such as $\delta = 1/4$ in their results, even when the more general conclusion for arbitrary $\delta\in(0, 1/2]$ has been established. I was wondering whether there is any particular meaning assigned to this specific number.
On the meaning of "with probability at least $1-\delta$"

414 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · 2026-04-11
There are 2 best solutions below
As long as $1-\delta>1/2$ can be achieved, we can run the algorithm repeatedly and combine the outputs (e.g. by a majority vote or a median); by the law of large numbers, the probability that the error is $\le \epsilon$ can then be made as close to $1$ as we like. So the particular value of $\delta$ is not very important, as long as $\delta<1/2$. (Although sometimes it is interesting to optimize $\delta$ for a specific $\epsilon$.)
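To illustrate the amplification argument above, here is a minimal Python sketch. The stochastic algorithm `noisy_estimate` is hypothetical, invented only for the demo: it is accurate with probability $3/4$ and returns garbage otherwise. Taking the median of $k$ independent runs drives the failure rate well below the single-run $\delta = 1/4$:

```python
import random
from statistics import median

random.seed(0)

def noisy_estimate(true_value=1.0, eps=0.1, delta=0.25):
    """Hypothetical stochastic algorithm: with probability >= 1 - delta,
    the output is within eps of true_value; otherwise it is arbitrary."""
    if random.random() < 1 - delta:
        return true_value + random.uniform(-eps, eps)   # success: small error
    return true_value + random.uniform(-10, 10)         # failure: large error

def amplified(k):
    """Run the algorithm k times and return the median of the outputs.
    The median is within eps of true_value whenever a majority of runs are."""
    return median(noisy_estimate() for _ in range(k))

def failure_rate(estimator, trials=20000, eps=0.1):
    """Empirical probability that the estimator misses by more than eps."""
    fails = sum(abs(estimator() - 1.0) > eps for _ in range(trials))
    return fails / trials

single = failure_rate(lambda: noisy_estimate())   # roughly delta = 0.25
boosted = failure_rate(lambda: amplified(k=15))   # far smaller
print(single, boosted)
```

The same majority-vote argument is what justifies treating the constants in BPP-style definitions as interchangeable.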
This is really universal in TCS. For example, in the definition of BPP, the numbers $1/3$ and $2/3$ can be replaced by $\epsilon$ and $1-\epsilon$ for any $\epsilon<1/2$.
There is no significance to $1/4$, except that it is a constant smaller than $1/2$. By the so-called "median trick," in many problems you can reduce the error probability to arbitrarily small $\delta>0$, at the cost of a factor of $O(\log(1/\delta))$ in resources (time or sample complexity). [Note that this factor is not always optimal, but it often is.]
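A Hoeffding-style calculation makes the $O(\log(1/\delta))$ factor concrete: if each run fails with probability $p < 1/2$, the median of $k$ runs fails only when at least half the runs fail, which happens with probability at most $\exp(-2k(1/2 - p)^2)$; solving for $k$ gives $k = \lceil \ln(1/\delta) / (2(1/2-p)^2) \rceil$. A short sketch (the function name is my own):

```python
import math

def repetitions_needed(delta, p_fail=1/4):
    """Smallest k such that the Hoeffding bound exp(-2*k*(1/2 - p_fail)**2)
    on the failure probability of the median of k runs is at most delta."""
    gap = 0.5 - p_fail          # distance of the per-run failure rate from 1/2
    return math.ceil(math.log(1 / delta) / (2 * gap ** 2))

# k grows only logarithmically as delta shrinks
for d in (1e-2, 1e-4, 1e-8):
    print(d, repetitions_needed(d))
```

Note how squaring the target $\delta$ only doubles the number of repetitions, which is exactly the $O(\log(1/\delta))$ overhead mentioned above.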
From this (related) question:
The choice of $1/4$ (instead of say $1/3$ or $1/1000$) then only affects the hidden constant inside the $O(\cdot)$.