Given $n$ independent coins, where each coin has bias $1/2 \pm \varepsilon$, how many coin flips are needed to correctly determine all biases with probability at least $1 - \delta$?
For a single coin, taking an empirical average (and rounding appropriately) gives the correct sample complexity of $O(\log(1/\delta)/\varepsilon^2)$, and optimality can be proved via Pinsker's inequality and a KL divergence argument. For $n$ coins, you can flip each coin $O(\log(n/\delta)/\varepsilon^2)$ times and round the empirical averages; a union bound over the $n$ coins then gives total sample complexity $O(n\log(n/\delta)/\varepsilon^2)$. I assume this is optimal but do not see a simple proof.
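To make the upper bound concrete, here is a minimal simulation sketch of the estimator described above. The function name and parameters are my own illustration, not from the question; the flip count $m = \lceil \ln(n/\delta)/(2\varepsilon^2) \rceil$ comes from the one-sided Hoeffding bound $\Pr[\text{wrong sign}] \le e^{-2m\varepsilon^2}$ combined with a union bound over the $n$ coins.

```python
import math
import random

def estimate_biases(n=50, eps=0.1, delta=0.05, seed=0):
    """Flip each of n coins m times and guess the sign of its bias.

    Hypothetical illustration: coin i has bias 1/2 + s_i * eps with
    s_i in {-1, +1}; we recover s_i by rounding the empirical mean.
    """
    rng = random.Random(seed)
    signs = [rng.choice([-1, 1]) for _ in range(n)]
    # One-sided Hoeffding: P(wrong sign for one coin) <= exp(-2 m eps^2).
    # Union bound over n coins: n * exp(-2 m eps^2) <= delta.
    m = math.ceil(math.log(n / delta) / (2 * eps**2))
    guesses = []
    for s in signs:
        p = 0.5 + s * eps
        heads = sum(rng.random() < p for _ in range(m))
        guesses.append(1 if heads / m >= 0.5 else -1)
    return signs, guesses, m

signs, guesses, m = estimate_biases()
# Total flips: n * m = Theta(n log(n/delta) / eps^2); all n signs are
# recovered with probability at least 1 - delta.
```

This only demonstrates the upper bound; the open part of the question is whether an adaptive strategy can beat the $\log n$ factor.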
If you restrict yourself to using empirical averages, this is a question about the maximum of a finite random process, and I think it's easy enough to show that you can't beat the union bound. I'm vaguely familiar with some tools for proving estimation/testing lower bounds (Fano's method, Le Cam's method, etc.), but the fact that an algorithm may adaptively pick which coin to flip at each step makes this a little harder to reason about.