Assume $x\in\mathbb{N}$ obeys some unknown distribution, and that I can sequentially and independently draw as many samples of $x$ as I like. Given an error rate $\epsilon$ and confidence $1-\delta$, can I find a sufficiently large number $m$ such that, if I draw $m$ samples, the probability of the next sample exceeding the maximum value $M$ observed so far is less than $\epsilon$, with confidence $1-\delta$?
(That is, if I have already drawn a sample $S_m = \{x_1, \ldots, x_m\}$, then $\Pr[\Pr[x_{m+1}>\max\{x_i \mid x_i\in S_m\}]\geq\epsilon]\leq\delta$.)
What you are asking for is the probability that the maximum value $M_n$ of a sample of size $n$ is at least the $(1-\epsilon)$-quantile of the underlying distribution.
Let each observation be $X_i \sim F_X$; then $P(X_i\leq x)=F_X(x) \implies P(X_i>x)=1-F_X(x)$.
Also, let $M_n:=\max\{X_i : i\in \{1,\dots,n\}\}$. Since the $X_i$ are independent, $P(M_n\leq x)=P(X_1\leq x)\cdots P(X_n\leq x)$, so $M_n \sim \left(F_X\right)^n$ (the CDF of the maximum is the $n$-th power of $F_X$).
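As a quick sanity check that the CDF of the sample maximum is $F_X^n$, here is a small simulation. The choice of a Geometric($p$) distribution on $\{1,2,\dots\}$ is purely illustrative, standing in for the unknown $F_X$; the values of $p$, $n$, and the evaluation point $x$ are arbitrary:

```python
import math
import random

# Stand-in for the unknown distribution: Geometric(p) on {1, 2, ...}
p, n, trials = 0.3, 5, 200_000
x = 4  # point at which to compare the empirical and theoretical CDFs

def geom_sample():
    # Inverse-CDF sampling for Geometric(p) on {1, 2, ...}
    u = random.random()
    return math.ceil(math.log(1 - u) / math.log(1 - p))

random.seed(0)
# Empirical P(M_n <= x): fraction of trials where the max of n draws is <= x
hits = sum(max(geom_sample() for _ in range(n)) <= x for _ in range(trials))
F_x = 1 - (1 - p) ** x  # P(X <= x) for Geometric(p)
print(hits / trials)    # empirical CDF of the maximum at x
print(F_x ** n)         # theoretical value F_X(x)^n; the two should be close
```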
Now, we can rephrase the question in terms of the above values, specifically, we want to solve the following problem:
$N_{\epsilon,\delta}:= \min\{n\in \mathbb{N}:P(M_n>\eta_{1-\epsilon})\geq 1-\delta\}$, where $\eta_{1-\epsilon}$ is the $(1-\epsilon)$-quantile of $F_X$.
We know that $P(M_n\leq \eta_{1-\epsilon})=F_X(\eta_{1-\epsilon})^n=(1-\epsilon)^n \implies P(M_n> \eta_{1-\epsilon})=1-(1-\epsilon)^n$. (This equality is exact when $F_X$ is continuous at the quantile; for a discrete distribution on $\mathbb{N}$, $F_X(\eta_{1-\epsilon})$ may exceed $1-\epsilon$, and the $n$ derived below still suffices.)
Therefore, we need to solve for $n$, given $\epsilon$ and $\delta$, so that $1-(1-\epsilon)^n \geq 1-\delta \implies (1-\epsilon)^n\leq\delta \implies n\geq\frac{\ln(\delta)}{\ln(1-\epsilon)}$ (dividing by $\ln(1-\epsilon)<0$ flips the inequality), giving $N_{\epsilon,\delta}=\left\lceil\frac{\ln(\delta)}{\ln(1-\epsilon)}\right\rceil$.
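As a worked example, the formula is easy to evaluate numerically; the function name `sample_size` is just an illustrative label, and the values $\epsilon=\delta=0.05$ are arbitrary:

```python
import math

def sample_size(eps: float, delta: float) -> int:
    """Smallest n with (1 - eps)**n <= delta,
    i.e. P(M_n > eta_{1-eps}) >= 1 - delta."""
    return math.ceil(math.log(delta) / math.log(1 - eps))

# With eps = delta = 0.05: need n >= ln(0.05)/ln(0.95) ~= 58.4, so n = 59.
print(sample_size(0.05, 0.05))  # 59
```

Note the ceiling: the raw ratio $\ln(\delta)/\ln(1-\epsilon)$ is generally not an integer, and rounding up is what guarantees $(1-\epsilon)^n\leq\delta$.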