Let $(\Omega,\mathcal F,P)$ be a probability space. Consider a collection of bounded real random variables $X(\gamma)$, for $\gamma\in[0,1]$, defined on this probability space. Let $(\gamma_i)_{i=1}^\infty$ be a sequence of iid random variables taking values in $[0,1]$. The family $\{\gamma_i : i \in \mathbb{N}\}$ is assumed to be independent of the family $\{X(\gamma) : \gamma \in [0,1]\}$. Consider the following inequality for a given $\delta>0$:
$$\sup_{i\in\mathbb{N}}P\bigg(\Big|X(\gamma_i)-E[X(\gamma_i)|\sigma(\gamma_i)]\Big|>\delta \bigg | \sigma(\gamma_i) \bigg)\leq \sup_{\gamma \in [0,1]}P\bigg(\Big|X(\gamma)-E[X(\gamma)]\Big|>\delta \bigg) .$$
It seems to me that this inequality is true: once we condition on $\gamma_i$, the conditional probability on the left should, for each realization of $\gamma_i$, coincide with one of the probabilities appearing in the supremum on the right.
But how to show it rigorously? Any help on this is very appreciated.
EDIT: In fact I would be happy if someone could just give a rigorous proof of the statement
$$E\big[X(\gamma_i)\,\big|\,\sigma(\gamma_i)\big](\omega)=E\big[X(\gamma_i(\omega))\big],\quad \omega\in\Omega,$$
where on the right-hand side $\gamma_i(\omega)$ is treated as a fixed parameter.
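For what it's worth: if one additionally assumes that the map $(\gamma,\omega)\mapsto X(\gamma)(\omega)$ is jointly measurable (an assumption not stated above), this identity follows from the standard "freezing" (independence) lemma. A sketch, writing $\varphi(\gamma):=E[X(\gamma)]$ and $\mu$ for the law of $\gamma_i$:

```latex
% Sketch under the joint-measurability assumption; X is bounded by hypothesis.
% For any bounded measurable h : [0,1] -> R, independence of \gamma_i from
% the family \{X(\gamma)\} together with Fubini's theorem gives
\begin{align*}
E\big[X(\gamma_i)\,h(\gamma_i)\big]
  &= \int_0^1 E\big[X(\gamma)\big]\,h(\gamma)\,\mu(d\gamma)
   = \int_0^1 \varphi(\gamma)\,h(\gamma)\,\mu(d\gamma)
   = E\big[\varphi(\gamma_i)\,h(\gamma_i)\big].
\end{align*}
% Since \varphi(\gamma_i) is \sigma(\gamma_i)-measurable and functions of the
% form h(\gamma_i) generate \sigma(\gamma_i), this verifies the defining
% property of conditional expectation, hence
\begin{align*}
E\big[X(\gamma_i)\,\big|\,\sigma(\gamma_i)\big] = \varphi(\gamma_i)
  \quad \text{a.s.}
\end{align*}
```

This is only a sketch; one still has to check that $\varphi$ is measurable, which again uses joint measurability (via Fubini).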
Okay, in my opinion, the crux of the problem is to make sense of all the symbols written up there.
I'm almost there, but I came up with two questions I could not yet answer. Fix $i\in\mathbb{N}$. What is the object $X(\gamma_i)$? I take it to be the function $$X(\gamma_i):\Omega \to \mathbb{R},\quad X(\gamma_i)(\omega):=X(\gamma_i(\omega))(\omega).$$ If so, why is this even a random variable, i.e. why is it measurable? And do we have additional assumptions on the distributions of the $\gamma_i$, for example, that they take values in a discrete subset of $[0,1]$?
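Measurability aside, one can sanity-check the conjectured identity numerically for a concrete, jointly measurable model. The model below is purely illustrative (none of it comes from the question): take $X(\gamma)(\omega) = \sin(\pi\gamma) + Z(\omega)$ with $Z$ standard normal and independent of $\gamma_i \sim \mathrm{Unif}[0,1]$, so that $E[X(\gamma)] = \sin(\pi\gamma)$ and the claimed identity predicts $E[X(\gamma_i)\mid\sigma(\gamma_i)] = \sin(\pi\gamma_i)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model (an assumption, not from the question):
#   X(gamma)(omega) = sin(pi * gamma) + Z(omega),  Z ~ N(0, 1),
# with Z independent of gamma_i ~ Uniform[0, 1].
# Then E[X(gamma)] = sin(pi * gamma), and the conjectured identity
#   E[X(gamma_i) | sigma(gamma_i)](omega) = E[X(gamma)] at gamma = gamma_i(omega)
# predicts that the conditional mean given gamma_i is sin(pi * gamma_i).

n = 200_000
gamma = rng.uniform(0.0, 1.0, size=n)  # samples of gamma_i
z = rng.standard_normal(n)             # noise, independent of gamma
x = np.sin(np.pi * gamma) + z          # samples of X(gamma_i)

# Approximate E[X | gamma_i in a small bin] by within-bin averages and
# compare with sin(pi * c) at the bin centers c.
edges = np.linspace(0.0, 1.0, 21)
centers = 0.5 * (edges[:-1] + edges[1:])
which = np.digitize(gamma, edges) - 1
cond_mean = np.array([x[which == k].mean() for k in range(len(centers))])

max_err = np.abs(cond_mean - np.sin(np.pi * centers)).max()
print(max_err)  # small: both sides agree up to Monte Carlo / binning error
```

Of course this proves nothing; it only illustrates what the identity asserts once a concrete jointly measurable version of $\{X(\gamma)\}$ is fixed.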
I'm sorry, I don't have the reputation to make a comment yet, so I had to write an "answer".