Let $x_l$ be draws from a uniformly distributed random variable $X\in [0,1]$ for the function: $$f(X) = \frac{1}{n^2}\sum_{l=1}^n \left[\frac{1}{2}\left(\left|x_l - \frac{1}{2}\right| - |1 - x_l|\right)\right]$$ where $n$ is some positive integer. I would like to find an upper bound on this expression. My first thought was to apply Markov's inequality, stated as follows: if $X$ is any nonnegative random variable, then $$P(X \geq a) \leq \frac{\mathbb{E}(X)}{a}$$ Now, each $x_l$ is a nonnegative r.v., but $f(\mathbf{x})$ is not (for a simple example, $f$ is negative if all the $x_l$ happen to equal $0$), so how should I approach this?
Both Markov's and Chebyshev's inequalities are stated for random variables rather than functions of random variables, so I may also be off base about their applicability here.
Hmm, I don't know if this is what you're looking for because the question is quite broad, but here is one possible bound:
since $-|1-x_l| \le 0$ and $|x_l - \frac{1}{2}| \le \frac{1}{2}$ for all $x_l \in [0,1]$, we have
$$ \begin{align*} f(x) \le \frac{1}{n^2} \sum_{l=1}^{n} \frac{1}{2} \bigg| x_l - \frac{1}{2} \bigg| \le \frac{1}{n^2} \sum_{l=1}^{n} \frac{1}{4} = \frac{1}{4n} \end{align*} $$
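As a numerical sanity check (not part of the proof), here is a small Monte Carlo sketch that evaluates $f$ on many uniform samples and confirms the realized values never exceed $\frac{1}{4n}$; the function name `f` and the sample sizes are my own choices for illustration:

```python
import random

def f(xs):
    """Evaluate f for a sample xs of size n, each entry drawn from Uniform[0, 1]."""
    n = len(xs)
    return sum(0.5 * (abs(x - 0.5) - abs(1 - x)) for x in xs) / n**2

random.seed(0)
n = 100
bound = 1 / (4 * n)

# Largest value of f observed over many random samples of size n.
worst = max(f([random.random() for _ in range(n)]) for _ in range(10_000))

print(worst <= bound)  # the deterministic bound 1/(4n) holds for every sample
```

Of course the bound is deterministic (it holds pointwise for every realization of the $x_l$), so no probabilistic inequality like Markov's is needed to obtain it.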