I have seen Chebyshev's inequality applied to bound the error of an estimator with respect to its expected value. For example, the estimator of the mean of a set of values after they have been protected by a differentially private mechanism (see page 2 here). These resources often give asymptotic bounds on the number of samples required to obtain an accurate estimate.
There, they also mention that one can obtain similar (and presumably sharper) bounds using the Chernoff inequality. I can see how to do that for the specific r.v. in the pdf. However, how can one use Chernoff when the r.v. is not as simple?
In my case, I do not have a Bernoulli r.v. but something more complex: the product of a Bernoulli and two mutually independent Rademacher r.v.'s (see here). The corresponding moment-generating function is
$$M_{\tilde{\mu}}(t)=M_X(t)M_{X'}(-t)$$
where
$$M_X(t)=(a+\frac{1-a}{2}(e^{\lambda t}+e^{-\lambda t}))^{n}\prod_{i=1}^n(1-a+a((1-\gamma_i)e^{\lambda t}+\gamma_i e^{-\lambda t}))$$
$$M_{X'}(t)=(a+\frac{1-a}{2}(e^{\lambda t}+e^{-\lambda t}))^{n}\prod_{i=1}^n(1-a+a((1-\gamma_i')e^{\lambda t}+\gamma_i' e^{-\lambda t}))$$
$$\gamma_i = \frac{1+v_i(2b-1)}{2}\quad\gamma_i' = \frac{1+v_i'(2b-1)}{2}$$
$$\lambda = \frac{1}{a(2b-1)n}$$
$$0\leq a,b\leq 1$$
$$-1\leq v_i, v_i' \leq 1\quad \forall i=1,\ldots n$$
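To keep the asymptotics honest, I transcribed the MGFs above into a small numeric sketch (the function names are mine; the formulas are exactly the ones stated above, with $M_{X'}$ being the same expression as $M_X$ but with the $v_i'$), so that any candidate bound can be spot-checked against direct evaluation:

```python
import math

def mgf_X(t, a, b, n, v):
    """M_X(t) as defined above; v is the list (v_1, ..., v_n)."""
    lam = 1.0 / (a * (2 * b - 1) * n)
    # Symmetric part: (a + (1-a)/2 * (e^{lam t} + e^{-lam t}))^n
    sym = (a + (1 - a) / 2 * (math.exp(lam * t) + math.exp(-lam * t))) ** n
    prod = 1.0
    for vi in v:
        gamma = (1 + vi * (2 * b - 1)) / 2
        prod *= 1 - a + a * ((1 - gamma) * math.exp(lam * t)
                             + gamma * math.exp(-lam * t))
    return sym * prod

def mgf_mu_tilde(t, a, b, n, v, v_prime):
    """M_{mu~}(t) = M_X(t) * M_{X'}(-t)."""
    return mgf_X(t, a, b, n, v) * mgf_X(-t, a, b, n, v_prime)
```

As a sanity check, $M_{\tilde{\mu}}(0)=1$ for any admissible parameters, since every factor equals $1$ at $t=0$.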
I know that the generic Chernoff inequality gives (writing $\varepsilon$ for the deviation, to avoid a clash with the parameter $a$ above)
$$\Pr[|\tilde{\mu}-\mu|<\varepsilon] \geq 1 - \frac{1}{e^{\varepsilon t}}\left(\frac{M_{\tilde{\mu}}(-t)}{e^{-\mu t}} + \frac{M_{\tilde{\mu}}(t)}{e^{\mu t}}\right)$$
If I call $\frac{1}{\alpha} = \frac{1}{e^{\varepsilon t}}(\frac{M_{\tilde{\mu}}(-t)}{e^{-\mu t}} + \frac{M_{\tilde{\mu}}(t)}{e^{\mu t}})$, then I should be able to obtain an asymptotic bound of this form:
$$|\tilde{\mu}-\mu| < \mathcal O(\log(f(n)))$$
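When I cannot optimize over $t$ in closed form, I fall back on minimizing the two-sided Chernoff bound numerically over a grid of $t$ values. Here is a minimal sketch (my own code; the stand-in MGF $\cosh(t/n)^n$, i.e. the mean of $n$ Rademacher r.v.'s with $\mu=0$, is just a toy to keep it self-contained and is not the $M_{\tilde{\mu}}$ above):

```python
import math

def chernoff_tail(mgf, mu, eps, t_grid):
    # Two-sided Chernoff bound: for every t > 0,
    #   Pr[|X - mu| >= eps] <= e^{-eps*t} * (M(-t)*e^{mu*t} + M(t)*e^{-mu*t});
    # we return the tightest bound found on the given grid of t values.
    best = float("inf")
    for t in t_grid:
        bound = math.exp(-eps * t) * (mgf(-t) * math.exp(mu * t)
                                      + mgf(t) * math.exp(-mu * t))
        best = min(best, bound)
    return best

n = 100

def mgf(t):
    # Toy stand-in MGF: mean of n Rademacher r.v.'s, M(t) = cosh(t/n)^n, mu = 0.
    return math.cosh(t / n) ** n

t_grid = [0.1 * k for k in range(1, 500)]
bound = chernoff_tail(mgf, 0.0, 0.3, t_grid)
```

A grid search is crude but good enough to see how the bound behaves as the parameters vary; a convexity argument or a 1-D minimizer would refine it.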
However, I get lost in the asymptotic analysis of the moment generating function. My best analysis assumes $\gamma_i=\gamma_i'=\frac{1}{2}$ and gives:
$M_{\tilde{\mu}}(t), M_{\tilde{\mu}}(-t) \in \mathcal O((e^{1/n} + e^{-1/n})^{4n})$, and so $|\tilde{\mu}-\mu| < \mathcal O(n)$.
But that doesn't make sense because the error should decrease as $n\to \infty$.
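One way I tried to check this suspicion numerically: with all $v_i = v_i' = 0$ (so $\gamma_i = \gamma_i' = \frac{1}{2}$), every factor of $M_X$ becomes a $\cosh$ and $M_X$ is even in $t$, hence $M_{\tilde{\mu}}(t) = M_X(t)^2$, which can be evaluated directly for growing $n$ (a sketch under my reading of the formulas above):

```python
import math

def mgf_half(t, a, b, n):
    # gamma_i = gamma_i' = 1/2: every factor becomes a cosh and M_X is even
    # in t, hence M_{mu~}(t) = M_X(t) * M_X(-t) = M_X(t)^2.
    lam = 1.0 / (a * (2 * b - 1) * n)
    c = math.cosh(lam * t)
    m_x = ((a + (1 - a) * c) * (1 - a + a * c)) ** n
    return m_x ** 2

# At fixed t each factor is 1 + O(1/n^2), so the product of 4n factors tends
# to 1 as n grows, rather than blowing up like (e^{1/n} + e^{-1/n})^{4n}.
vals = [mgf_half(1.0, 0.5, 0.9, n) for n in (10, 100, 1000)]
```

For these parameters the values decrease monotonically toward $1$, which supports the intuition that the bound should tighten as $n\to\infty$; my guess is that the $\mathcal O(n)$ result above comes from dropping the $\frac{1}{2}$ factors before taking the $4n$-th power.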