Conditional expectation conditioned both to a random variable and an event


Consider a uniform random variable $X$ on the interval $(0,1)$. Trivially we have that

$$\mathbb{E}[X|X<\tfrac{1}{2}]=\frac{1}{4}.$$

Now, based on my intuition, I would like to say that $(X|X<\tfrac{1}{2})$, conditioned on the random variable $X$, is a uniform random variable on $(0,\tfrac{1}{2})$.

How do we formally write and prove it, in terms of the "conditional expectation" formalism?
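A quick Monte Carlo sketch of the claim (my own check, not part of the formal question): sample $X\sim\mathrm{Uniform}(0,1)$, keep the draws with $X<\tfrac12$, and compare the conditional mean and CDF with those of $\mathrm{Uniform}(0,\tfrac12)$.

```python
import random

# Sample X ~ Uniform(0,1) and condition on the event {X < 1/2}
# by keeping only the draws that satisfy it.
random.seed(0)
samples = [random.random() for _ in range(200_000)]
conditioned = [x for x in samples if x < 0.5]

cond_mean = sum(conditioned) / len(conditioned)
print(f"E[X | X < 1/2] ~ {cond_mean:.3f}")   # should be close to 1/4

# The conditional CDF at t should be close to 2t for t in [0, 1/2].
for t in (0.1, 0.25, 0.4):
    cdf = sum(x <= t for x in conditioned) / len(conditioned)
    print(f"P(X <= {t} | X < 1/2) ~ {cdf:.3f} (exact: {2 * t})")
```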


Another point of view on the same problem: consider this expression for two dependent uniform random variables $U,V$ on $\{1,2,\dots,n\}$:

$$\mathbb{P}(U<V)=\sum_{m}\mathbb{P}(U<m|V=m)\mathbb{P}(V=m)$$

If now I consider the conditional probability $\mathbb{P}(U<V|U),$ I would like to generalize the previous expression to:

$$\mathbb{P}(U<V|U)=\sum_{m}\mathbb{P}(U<m|V=m,U)\mathbb{P}(V=m|U)$$

but I do not know how to exactly define the expression $\mathbb{P}(U<m|V=m,U).$ Any help?


There are 2 answers below.


The conditional distribution of $X$ given the event $X < 1/2$ is indeed uniform on $[0, 1/2]$.

One way is to show that for $t \in [0, 1/2]$, $$P(X \le t \mid X < 1/2) = \frac{P(X \le t, X < 1/2)}{P(X < 1/2)} = 2 P(X \le t) = 2t,$$ which is the CDF of the uniform distribution on $[0, 1/2]$.


Unfortunately, I do not understand what you mean by "$(X \mid X < 1/2)$ conditioned on the random variable $X$." Once you condition on $X$, then $X$ is no longer random. Given that you were looking for something with the uniform distribution on $[0, 1/2]$, I suspect you wanted the above.


By definition, if $P(A)>0$, then $E(X\mid A)=\frac{E(XI_A)}{P(A)}$ $\hspace{.5cm}$ (1)
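As a sanity check (my own addition), definition (1) can be evaluated numerically for the opening example, $X\sim\mathrm{Uniform}(0,1)$ and $A=\{X<\tfrac12\}$, where $E(XI_A)=\int_0^{1/2}x\,dx=\tfrac18$ and $P(A)=\tfrac12$:

```python
# Midpoint Riemann sum approximating E[X 1_A] and P(A) for
# X ~ Uniform(0,1), A = {X < 1/2}; then E[X | A] = E[X 1_A] / P(A).
N = 100_000
grid = [(k + 0.5) / N for k in range(N)]          # midpoint grid on (0, 1)
E_X_on_A = sum(x for x in grid if x < 0.5) / N    # ~ integral_0^{1/2} x dx = 1/8
P_A = sum(1 for x in grid if x < 0.5) / N         # ~ P(X < 1/2) = 1/2
E_cond = E_X_on_A / P_A
print(E_cond)   # ~ 0.25
```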

Your note, "in terms of the 'conditional expectation' formalism", indicates that you want conditional probability defined in terms of conditional expectation.

With $B=\{X\leq t\}$ and $C=\{X<\tfrac{1}{2}\}$ (the event in your question), $P(X\leq t\mid X<\tfrac{1}{2})=P(B\mid C)=E(I_B\mid C)=\frac{E(I_BI_C)}{P(C)}=\frac{E(I_{B\cap C})}{P(C)}=\frac{P(B\cap C)}{P(C)}=\frac{P(\{X\leq t\}\cap\{X<\tfrac{1}{2}\})}{P(\{X<\tfrac{1}{2}\})}.$

From here it is clear how to continue.

For the second question:

$P(U<V)=E(I_{U<V})=E(Z)=E\big(E(Z\mid V)\big)=\sum_{t=1}^{n} E(Z\mid V=t)\,P(V=t)= \sum_{t=1}^{n} E(I_{U<V}\mid V=t)\,P(V=t)= \sum_{t=1}^{n} E(I_{U<t}\mid V=t)\,P(V=t)= \sum_{t=1}^{n} P(U<t\mid V=t)\,P(V=t),$ where $Z=I_{U<V}$.
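The decomposition above can be checked exactly on a small concrete joint pmf on $\{1,2,3\}^2$ (the weights below are my own hypothetical example of dependent $U,V$):

```python
from fractions import Fraction
from itertools import product

# A dependent joint pmf on {1,2,3} x {1,2,3}: weight u + 2v, normalized.
n = 3
weights = {(u, v): u + 2 * v for u, v in product(range(1, n + 1), repeat=2)}
total = sum(weights.values())
pmf = {uv: Fraction(w, total) for uv, w in weights.items()}

# Left-hand side: P(U < V) computed directly from the joint pmf.
lhs = sum(p for (u, v), p in pmf.items() if u < v)

# Right-hand side: sum_t P(U < t | V = t) P(V = t).
rhs = Fraction(0)
for t in range(1, n + 1):
    p_V = sum(p for (u, v), p in pmf.items() if v == t)
    if p_V:
        p_cond = sum(p for (u, v), p in pmf.items() if v == t and u < t) / p_V
        rhs += p_cond * p_V

print(lhs, rhs)   # both print 10/27
```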

Note that, by definition, $E(Z\mid V)$ is a function of $V$.

If you want to generalize it, follow these steps:

Let $W=(V,U)$, taking values $w\in R_W=\{(v,u)\mid u\in \{1,\cdots , n\},\ v\in \{1,\cdots , n\} \}$.

$P(U<V)=E(I_{U<V})=E\big(E(I_{U<V}\mid W)\big)=\sum_{w\in R_W} E(I_{U<V}\mid W=w)\,P(W=w) =\sum_{v=1}^{n} \sum_{u=1}^{n} E(I_{U<V}\mid V=v,U=u)\,P(V=v,U=u) =\sum_{v=1}^{n} \sum_{u=1}^{n} P(U<V\mid V=v,U=u)\,P(V=v,U=u) =\sum_{v=1}^{n} \sum_{u=1}^{n} P(U<V\mid V=v,U=u)\,P(V=v\mid U=u)\,P(U=u)$
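In this double sum the inner conditional probability degenerates: $P(U<V\mid V=v,U=u)=I_{\{u<v\}}$. A small exact check on a hypothetical dependent joint pmf (my own example weights):

```python
from fractions import Fraction
from itertools import product

# A dependent joint pmf on {1,2,3} x {1,2,3}: weight u*v + 1, normalized.
n = 3
weights = {(u, v): u * v + 1 for u, v in product(range(1, n + 1), repeat=2)}
total = sum(weights.values())
pmf = {uv: Fraction(w, total) for uv, w in weights.items()}

# Direct computation of P(U < V).
lhs = sum(p for (u, v), p in pmf.items() if u < v)

# Double sum: P(U < V | V=v, U=u) is the indicator 1{u < v}, and
# P(V=v | U=u) P(U=u) = P(V=v, U=u) is just the joint pmf.
rhs = Fraction(0)
for v in range(1, n + 1):
    for u in range(1, n + 1):
        indicator = Fraction(1) if u < v else Fraction(0)
        rhs += indicator * pmf[(u, v)]

print(lhs, rhs)   # both print 14/45
```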

But note that $P(U<V\mid U)$ is not a generalization of the previous step. Writing $A=\{U<V\}$, we have $P(U<V\mid U)=P(A\mid U)=E(I_A\mid U)$, and $E(I_A\mid U=u) = \sum_{v=1}^{n} I_{\{u<v\}}\,P(V=v\mid U=u)= \sum_{v=u+1}^{n} P(V=v\mid U=u).$
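The formula $P(U<V\mid U=u)=\sum_{v=u+1}^{n} P(V=v\mid U=u)$ can again be verified exactly on a hypothetical dependent joint pmf (my own example, not from the question):

```python
from fractions import Fraction
from itertools import product

# A dependent joint pmf on {1,...,4}^2: weight 1 + |u - v|, normalized.
n = 4
weights = {(u, v): 1 + abs(u - v) for u, v in product(range(1, n + 1), repeat=2)}
total = sum(weights.values())
pmf = {uv: Fraction(w, total) for uv, w in weights.items()}

def p_U(u):
    """Marginal P(U = u)."""
    return sum(p for (uu, v), p in pmf.items() if uu == u)

def p_V_given_U(v, u):
    """Conditional P(V = v | U = u)."""
    return pmf[(u, v)] / p_U(u)

results = {}
for u in range(1, n + 1):
    # Right-hand side of the formula: sum_{v > u} P(V = v | U = u).
    rhs = sum(p_V_given_U(v, u) for v in range(u + 1, n + 1))
    # Direct computation of P(U < V | U = u) from the joint pmf.
    direct = sum(p for (uu, v), p in pmf.items() if uu == u and v > u) / p_U(u)
    results[u] = (rhs, direct)
    print(u, rhs, direct)
```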

You should explain more precisely what you mean by that expression.