I have been struggling with an error propagation problem recently, and I do not know which tools apply. Explicitly, my problem can be stated as follows:
I have the expression $$\varepsilon=\frac{1}{N}\left|\sum_{i=1}^N\left[f(x_i+\epsilon_i)-f(x_i)\right]\right|$$ where the $\epsilon_i$ are i.i.d. with expectation $\mathbb{E}[\epsilon_i]=\mu$ and variance $\operatorname{Var}(\epsilon_i)=\sigma^2$, and $f(\cdot)$ is some function.
Is there any method to find a tail bound on $\varepsilon$ for a general function $f(\cdot)$, or at least for some specific functions, e.g. $f(x)=e^{-ax}$ or $f(x)=ax^2$?
==========================================
Here is a result I found:
For $f(x)=e^{-ax}$ we have $\left|f(x_i+\epsilon_i)-f(x_i)\right|=e^{-ax_i}\left|e^{-a\epsilon_i}-1\right|$, so assuming $ax_i\ge 0$ (hence $e^{-ax_i}\le 1$), the triangle inequality gives $$\varepsilon=\frac{1}{N}\left|\sum_{i=1}^N\left[f(x_i+\epsilon_i)-f(x_i)\right]\right|\leq \frac{1}{N}\sum_{i=1}^N\left|e^{-a\epsilon_i}-1\right|.$$
Since the $\epsilon_i$ are i.i.d., the variables $X_i=\left|e^{-a\epsilon_i}-1\right|$ are also i.i.d.
Using the central limit theorem we can find a tail bound on $\frac{1}{N}\sum_{i=1}^N\left|e^{-a\epsilon_i}-1\right|$, and hence a tail bound on $\varepsilon$.
But this result is somewhat trivial, because the tail bound is not centered at zero: the sample mean of the $X_i$ concentrates around $\mathbb{E}[X_1]>0$, so the bound cannot show that $\varepsilon$ becomes small.
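A minimal numerical sketch of this point (the setup is assumed for illustration: Gaussian errors with $\mu=0$, $\sigma=0.1$, and $a=1$). The bound $\frac{1}{N}\sum_i\left|e^{-a\epsilon_i}-1\right|$ concentrates around its nonzero mean $\mathbb{E}\left|e^{-a\epsilon_1}-1\right|$, which for small $\sigma$ is approximately $a\sigma\sqrt{2/\pi}$:

```python
import numpy as np

# Assumed setup: eps_i ~ N(mu, sigma^2) with mu = 0, sigma = 0.1, and a = 1.
# The bound from above is  eps <= (1/N) * sum_i |exp(-a*eps_i) - 1|,
# and this sample mean concentrates around E|exp(-a*eps_1) - 1| > 0,
# i.e. the bound is NOT centered at zero.
rng = np.random.default_rng(0)
a, mu, sigma, N = 1.0, 0.0, 0.1, 100_000

eps_i = rng.normal(mu, sigma, size=N)
X = np.abs(np.exp(-a * eps_i) - 1.0)

bound = X.mean()  # (1/N) sum |exp(-a eps_i) - 1|
# For small sigma, |exp(-a eps) - 1| ~ a|eps|, so E[X] ~ a*sigma*sqrt(2/pi)
print(bound, a * sigma * np.sqrt(2 / np.pi))
```

The printed values show the bound stabilizing near the nonzero constant, not near zero.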
So is there any other result for this problem?
Suppose that $f$ is a Lipschitz function with constant $c$; for example, $\sin(bx)$ or $\frac{1}{1+x^2}$.
In this case $$\varepsilon=\frac{1}{N}\left|\sum_{i=1}^N\left[f(x_i+\epsilon_i)-f(x_i)\right]\right| \le \frac{1}{N}\sum_{i=1}^N|f(x_i+\epsilon_i)-f(x_i)| \le c \frac{\sum_{i=1}^N|\epsilon_i| }{N}.$$
$\frac{\sum_{i=1}^N|\epsilon_i| }{N} \to E|\epsilon_1|$ a.s., and it is asymptotically normal after centering and scaling.
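A quick check of the Lipschitz bound (the concrete choices are assumed for illustration: $f(x)=\sin(2x)$, so $c=2$, with $\epsilon_i\sim N(0,\,0.25)$ and random distinct $x_i$). For Gaussian errors, $E|\epsilon_1|=\sigma\sqrt{2/\pi}$:

```python
import numpy as np

# Assumed setup: f(x) = sin(b x) with b = 2 (Lipschitz constant c = b),
# eps_i ~ N(0, sigma^2) with sigma = 0.5, and arbitrary distinct x_i.
# We verify  eps <= c * (1/N) sum |eps_i|  and that the right-hand side
# approaches c * E|eps_1| = c * sigma * sqrt(2/pi).
rng = np.random.default_rng(1)
b, sigma, N = 2.0, 0.5, 200_000
c = b  # Lipschitz constant of sin(b x)

x = rng.uniform(-5, 5, size=N)      # arbitrary (distinct) evaluation points
eps_i = rng.normal(0.0, sigma, size=N)

err = abs(np.mean(np.sin(b * (x + eps_i)) - np.sin(b * x)))  # epsilon
bound = c * np.mean(np.abs(eps_i))                           # c * (1/N) sum |eps_i|

print(err, bound, c * sigma * np.sqrt(2 / np.pi))
```

The inequality $\varepsilon \le c\,\frac{1}{N}\sum_i|\epsilon_i|$ holds pointwise for every sample, and the bound converges to $c\,E|\epsilon_1|$, consistent with the strong law of large numbers.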
As the $x_i$ are all distinct and $f$ is arbitrary, there is no reason to expect that the errors will compensate each other.
In the special case $x_1 = x_2 = \ldots$ we may put $Y_i = f(x_i+\epsilon_i)-f(x_i)$ and apply the CLT to the $Y_i$.
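A sketch of this special case (the concrete choices are assumed: $f(x)=e^{-x}$, a common point $x_i=1$, and $\epsilon_i\sim N(0,\,0.01)$). Here the $Y_i$ are i.i.d., so their sample mean converges to $E[Y_1]$, which can be computed in closed form for Gaussian errors since $E[e^{-\epsilon}]=e^{\sigma^2/2}$:

```python
import numpy as np

# Assumed setup: x_1 = x_2 = ... = x0 = 1, f(x) = exp(-x),
# eps_i ~ N(0, sigma^2) with sigma = 0.1.
# Then Y_i = f(x0 + eps_i) - f(x0) are i.i.d., and by the CLT the mean of the
# Y_i fluctuates around E[Y_1] at scale 1/sqrt(N).
rng = np.random.default_rng(2)
x0, sigma, N = 1.0, 0.1, 100_000

eps_i = rng.normal(0.0, sigma, size=N)
Y = np.exp(-(x0 + eps_i)) - np.exp(-x0)

# For Gaussian eps: E[exp(-eps)] = exp(sigma^2 / 2), so
# E[Y_1] = exp(-x0) * (exp(sigma^2 / 2) - 1), a small but nonzero bias.
EY = np.exp(-x0) * (np.exp(sigma**2 / 2) - 1.0)
print(abs(Y.mean()), abs(EY))
```

Note that even with $\mu=0$, $E[Y_1]$ is nonzero unless $f$ is affine, so $\varepsilon$ concentrates around $|E[Y_1]|$ rather than around zero.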