Given $N$ coins, find a coin with minimal bias based on $N$ samples


General description:

Given $N$ coins $Z_1,\dots,Z_N$ (Bernoulli random variables), where the $i$-th coin has probability $p_i$ of "Heads", I'm trying to identify a coin whose bias is close to $\min\limits _{i\in[N]}p_{i}$.

I'm interested in a "probably approximately correct" estimate: given $\epsilon,\delta \in (0,1)$, my guess $\hat{i}$ must satisfy $\mathbb{P}(p_{\hat{i}}\leq \min\limits _{i\in[N]}p_{i} + \epsilon)\geq 1-\delta$.

The quantity I'm trying to minimize is $m$, the number of samples drawn from each coin needed to satisfy the probabilistic condition defined by $\epsilon$ and $\delta$.

My objective:

I must give an algorithm (or general method) which needs at most $m=\left\lceil \frac{1}{\epsilon^{2}}\cdot\frac{N}{\delta}\right\rceil $ samples per coin.
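For concreteness, the sample budget this bound demands can be computed directly; a minimal sketch (the function name and example values of $N$, $\epsilon$, $\delta$ are my own, not from the problem statement):

```python
from math import ceil

def samples_per_coin(N, epsilon, delta):
    # Target bound from the problem: m = ceil((1/epsilon^2) * (N/delta)),
    # the number of samples drawn from EACH of the N coins.
    return ceil(N / (epsilon**2 * delta))

m = samples_per_coin(N=10, epsilon=0.1, delta=0.05)  # -> 20000 samples per coin
```

Note the total number of coin flips is $N\cdot m$, so in this illustrative setting $10 \cdot 20000 = 200000$ flips overall.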

My try:

The straightforward method is as follows:

  • Given samples $\left(\left(z_{1}^{i},...,z_{m}^{i}\right)\right)_{i=1}^{N}$, compute $\hat{p}_{i}=\frac{1}{m}\sum\limits _{j=1}^{m}z_{j}^{i}$ for all $i \in [N]$
  • return $\hat{i}$ s.t. $\hat{p}_{\hat{i}}=\min\limits _{i\in [N]}\hat{p}_{i}$
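The two steps above can be sketched as follows, assuming NumPy; the function name, seed handling, and example biases are illustrative, not part of the original question:

```python
import numpy as np

def estimate_min_coin(p, epsilon, delta, rng=None):
    """Empirical-mean selection: flip each coin m times and return the
    index of the smallest empirical head-frequency, where m matches the
    target bound m = ceil(N / (epsilon^2 * delta))."""
    rng = np.random.default_rng(rng)
    p = np.asarray(p, dtype=float)
    N = len(p)
    m = int(np.ceil(N / (epsilon**2 * delta)))
    samples = rng.random((N, m)) < p[:, None]  # (N, m) Bernoulli(p_i) draws
    p_hat = samples.mean(axis=1)               # empirical bias of each coin
    return int(np.argmin(p_hat)), m

# Illustrative run: with a large gap between the best and second-best coin,
# the minimizer of the empirical means picks the true minimum-bias coin.
i_hat, m = estimate_min_coin([0.9, 0.5, 0.2, 0.6], epsilon=0.1, delta=0.1, rng=0)
```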

The tricky part is to prove the bound on $m$.

This is supposed to be rather simple, using basic tools such as Chebyshev's inequality and the union bound, but everything I tried fell short.

One of my attempts:

If $|\hat{p_i}-p_i|<\epsilon$ holds for all $i\in [N]$, then the returned $\hat{i}$ satisfies $p_{\hat{i}} < \min\limits _{i\in[N]}p_{i} + 2\epsilon$ (so requiring accuracy $\epsilon/2$ per coin would give the desired $\epsilon$-guarantee), but when I bound the probability of this event via Chebyshev's inequality, the resulting guarantee is much weaker than $1-\delta$.

Any help would be much appreciated.

Best answer:

Actually, Chebyshev's inequality combined with the union bound does the trick: apply Chebyshev to each coin separately, then sum the resulting failure probabilities over all $N$ coins.
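To make the summing step explicit, here is a sketch of the standard argument, using estimation accuracy $\epsilon/2$ per coin and $\operatorname{Var}(Z_i)=p_i(1-p_i)\leq \frac{1}{4}$. Chebyshev's inequality applied to each coin gives
\[
\mathbb{P}\left(\left|\hat{p}_{i}-p_{i}\right|\geq\frac{\epsilon}{2}\right)\leq\frac{\operatorname{Var}(\hat{p}_{i})}{(\epsilon/2)^{2}}=\frac{p_{i}(1-p_{i})}{m\,(\epsilon/2)^{2}}\leq\frac{1}{m\epsilon^{2}}.
\]
Summing these over all $N$ coins (the union bound), the probability that any estimate misses by $\epsilon/2$ or more is at most $\frac{N}{m\epsilon^{2}}$, which is at most $\delta$ once $m\geq\frac{1}{\epsilon^{2}}\cdot\frac{N}{\delta}$. On the complementary event, writing $i^{*}=\arg\min\limits _{i\in[N]}p_{i}$,
\[
p_{\hat{i}}\leq\hat{p}_{\hat{i}}+\frac{\epsilon}{2}\leq\hat{p}_{i^{*}}+\frac{\epsilon}{2}\leq p_{i^{*}}+\epsilon,
\]
which is exactly the required guarantee.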