An efficient stopping rule to determine the sign of the mean of an i.i.d. sequence of random variables.


Does there exist a family of measurable functions $(f_t^\delta)_{t \in \mathbb{N}, \delta \in (0,1)}$ and constants $C,c>0$ such that, for each $t \in \mathbb{N}$ and $\delta \in (0,1)$, we have $f_t^\delta : \mathbb{R}^t \to\{0,1\}$ and, whenever $(X_t)_{t\in\mathbb{N}}$ is a sequence of i.i.d. $[-1,1]$-valued random variables with zero mean, we have \begin{equation} \forall \varepsilon \in \mathbb{R}\backslash\{0\}, \forall \delta \in (0,1), \mathbb{P}\bigg[\bigg\{ \varepsilon\cdot\frac{1}{T}\sum_{t=1}^T(X_t+\varepsilon) > 0\bigg\} \cap \bigg\{ T= \min \{\tau\in\mathbb{N} \mid f_\tau^\delta(X_1+\varepsilon,\dots,X_\tau+\varepsilon)=1\} \bigg\} \bigg]\ge 1- \delta \end{equation} and \begin{equation} \forall \varepsilon \in \mathbb{R}\backslash\{0\}, \forall \delta \in (0,1), \mathbb{E}\Big[\min \{\tau\in\mathbb{N} \mid f_\tau^\delta(X_1+\varepsilon,\dots,X_\tau+\varepsilon)=1\}\Big] \le \frac{C}{\varepsilon^2} \log\frac{c}{\delta} \; ? \end{equation}

Intuitively, I'm looking for a deterministic algorithm (the family $(f_t^\delta)_{t \in \mathbb{N}, \delta \in (0,1)}$) that, given a confidence parameter $\delta$, provides a stopping criterion: we stop at the first time $T \in \mathbb{N}$ at which we observe $f^\delta_T=1$. When the rule stops, it should do so because, with probability at least $1-\delta$, the sign of $\frac{1}{T}\sum_{t=1}^T(X_t+\varepsilon)$ agrees with the sign of $\varepsilon$ (regardless of the actual value of $\varepsilon \in \mathbb{R} \backslash \{0\}$). Moreover, the rule should be efficient in the number of samples queried, i.e., it shouldn't waste a lot of time making sure it has identified the sign of $\varepsilon$, at least in expectation.
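For concreteness, here is a sketch of the kind of rule I have in mind (not a verified answer to the question): an anytime-valid Hoeffding test that stops once the running empirical mean exceeds a time-uniform confidence radius, with the confidence budget split as $\delta_t = \delta/(t(t+1))$ so that $\sum_{t\ge 1} \delta_t = \delta$. The function names `threshold` and `stopped_sign` are mine; whether this rule meets the $\frac{C}{\varepsilon^2}\log\frac{c}{\delta}$ expected-time bound exactly, or only up to an extra $\log(1/\varepsilon)$ factor, is part of what I'm asking.

```python
import math
import random

def threshold(t, delta):
    """Time-uniform Hoeffding radius for observations with range 2.

    Uses the per-time budget delta_t = delta / (t * (t + 1)), which sums
    to delta over all t >= 1, so the deviation bound holds simultaneously
    for every t with probability at least 1 - delta.
    """
    return math.sqrt((2.0 / t) * math.log(2.0 * t * (t + 1) / delta))

def stopped_sign(samples, delta, max_t=10**6):
    """Stop at the first t where |running mean| > threshold(t, delta).

    Returns (declared_sign, stopping_time); declared_sign is 0 if max_t
    is reached without stopping (this has vanishing probability when the
    true shift eps is nonzero).
    """
    total = 0.0
    for t, y in enumerate(samples, start=1):
        total += y
        mean = total / t
        if abs(mean) > threshold(t, delta):
            return (1 if mean > 0 else -1), t
        if t >= max_t:
            return 0, t
    return 0, max_t

# Usage: Rademacher X_t shifted by eps = 0.5, confidence delta = 0.05.
def shifted_rademacher(eps, rng):
    while True:
        yield (1.0 if rng.random() < 0.5 else -1.0) + eps

rng = random.Random(0)
sign, T = stopped_sign(shifted_rademacher(0.5, rng), delta=0.05)
```

The logic behind the sketch: on the good event that the empirical mean stays within `threshold(t, delta)` of $\varepsilon$ for all $t$ simultaneously (probability $\ge 1-\delta$ by Hoeffding plus the union bound over $t$), the first time the empirical mean exits the confidence band it must have the same sign as $\varepsilon$.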