Show equivalence between two expressions involving independent uniforms on $[0,1]$


All random variables below are defined on the same probability space $(S, \mathcal{A}, \mu)$.

Assume $$ X=f(\alpha, \beta, \gamma, \delta) \quad (\star) $$ for some mutually independent random variables $\alpha\sim U([0,1])$, $\beta\sim U([0,1])$, $\gamma\sim U([0,1])$, $\delta\sim U([0,1])$, and for some measurable function $f:[0,1]^4\rightarrow \{0,1\}$.

I want to show that $(\star)$ is equivalent to $$ X=1\{\delta\leq h(\alpha, \beta, \gamma)\} \quad (\star\star) $$ where $h:[0,1]^3\rightarrow [0,1]$ is given by $$h(\alpha, \beta, \gamma)=\mu(f(\alpha, \beta, \gamma, \delta)=1\mid \alpha, \beta, \gamma).$$


My attempt: according to $(\star)$, $$ \mu(X=1)=\mu(f(\alpha, \beta, \gamma, \delta)=1). $$ According to $(\star \star)$, $$ \mu(X=1)=\mu(\delta \leq h(\alpha, \beta, \gamma))=\mu(\delta\leq \mu(f(\alpha, \beta, \gamma, \delta)=1\mid \alpha, \beta, \gamma)). $$ Could you explain how the last term can be equal to $\mu(f(\alpha, \beta, \gamma, \delta)=1)$?
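As a numerical sanity check of these two displayed probabilities, here is a Monte Carlo sketch for a toy choice of $f$ of my own (not from the question): $f(\alpha,\beta,\gamma,\delta)=\mathbf 1_{\delta>\alpha}$, for which $h(\alpha,\beta,\gamma)=\mu(\delta>\alpha\mid\alpha,\beta,\gamma)=1-\alpha$ since $\delta\sim U([0,1])$ is independent of $(\alpha,\beta,\gamma)$. The two events differ pointwise, but their probabilities agree:

```python
import random

random.seed(1)
N = 200_000

# Toy f (my assumption, not from the question): f(a, b, g, d) = 1{d > a}.
# Then h(a, b, g) = mu(delta > a | a, b, g) = 1 - a, because delta ~ U[0,1]
# is independent of (alpha, beta, gamma).
hits_star = 0       # event {f(alpha, beta, gamma, delta) = 1} from (star)
hits_starstar = 0   # event {delta <= h(alpha, beta, gamma)} from (star star)
for _ in range(N):
    a, b, g, d = (random.random() for _ in range(4))
    hits_star += d > a
    hits_starstar += d <= 1 - a

print(hits_star / N, hits_starstar / N)  # both ~ 1/2 = E[1 - alpha]
```

Both frequencies come out close to $1/2$, consistent with equality of the marginal probabilities, even though the two indicator events are not the same subset of the sample space.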

Accepted answer:

You are trying to show the following:

Assume that $Y$, uniform on $[0,1]^3$, and $Z$, uniform on $[0,1]$, are independent, and consider a Bernoulli random variable $X=f(Y,Z)$. Then $X=1_A$ where $A=\{Z<T\}$ for $T=P(X=1\mid Y)$.

The hypothesis that $X$ is Bernoulli is equivalent to the fact that $X=1_C$ where $C=\{f(Y,Z)=1\}$ hence the goal is to show:

Let $C$ belong to the sigma-algebra $\sigma(Y,Z)$ and define $T=P(C\mid Y)$, then $C=\{Z<T\}$.

But this has no chance of holding in general...

For a counterexample, let $C=\{Z>z\}$ for some $z$ in $(0,1)$. Since $Z$ is independent of $Y$, $T=t$ almost surely, where $t=P(Z>z)=1-z$; hence $\{Z<T\}=\{Z<t\}$, and $C=\{Z>z\}\ne\{Z<t\}=\{Z<T\}$.


In your notation, consider $X=\mathbf 1_{\delta>x}$ for some $x$ in $(0,1)$. Then $h(\alpha,\beta,\gamma)=\mu(\delta>x)=1-x$, hence the RHS of $(\star\star)$ is $X^*=\mathbf 1_{\delta\leqslant1-x}$, which is not $X$, and not even a function of $X$ in general, even though, of course, $E(X^*)=E(X)$.
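A short simulation of this counterexample (with the arbitrary choice $x=0.3$, my pick for illustration) shows that $X$ and $X^*$ have the same mean $1-x$ yet disagree on a set of positive measure:

```python
import random

random.seed(2)
x = 0.3          # any x in (0,1) works; 0.3 is an arbitrary choice
N = 200_000

mean_X = 0      # X  = 1{delta > x}
mean_Xstar = 0  # X* = 1{delta <= 1 - x}, the RHS of (star star)
disagree = 0    # event {X != X*}
for _ in range(N):
    d = random.random()   # delta ~ U[0,1]; alpha, beta, gamma play no role here
    X = d > x
    Xstar = d <= 1 - x
    mean_X += X
    mean_Xstar += Xstar
    disagree += X != Xstar

print(mean_X / N, mean_Xstar / N)  # both ~ 1 - x = 0.7
print(disagree / N)                # ~ 2 * min(x, 1 - x) = 0.6 > 0
```

So the distributional identity $E(X^*)=E(X)$ holds, but the pointwise (almost-sure) equality claimed by $(\star\star)$ fails: here $X\ne X^*$ exactly when $\delta\leqslant x$ or $\delta>1-x$, an event of probability $2\min(x,1-x)$.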