Show that $P_{U_1|V_1}=P_{U_2|V_2}$ and $P_{V_1|U_1}=P_{V_2|U_2}$ implies that $P_{U_1}=P_{U_2}$ and $P_{V_1}=P_{V_2}$


Let $(U_1,V_1)$ and $(U_2,V_2)$ be two pairs of random variables, defined on the same measurable space, with corresponding joint distributions $P_{U_1,V_1}$ and $P_{U_2,V_2}$.

Now suppose the following two conditions: \begin{align} P_{U_1|V_1}(A|v)=P_{U_2|V_2}(A|v) \end{align} for every measurable set $A$ and every $v$, and \begin{align} P_{V_1|U_1}(B|u)=P_{V_2|U_2}(B|u) \end{align} for every measurable set $B$ and every $u$.

Question: Does this imply that the marginal distributions are the same, that is, $P_{U_1}=P_{U_2}$ and $P_{V_1}=P_{V_2}$?

Proof in the discrete case. Suppose the random variables are discrete and that all the probabilities involved are strictly positive (positivity is needed to cancel the conditional probabilities below). Then, by Bayes' rule,

\begin{align} P_{U_1|V_1}(u|v)= \frac{P_{V_1|U_1}(v|u) P_{U_1}(u)}{P_{V_1}(v)}= \frac{P_{V_2|U_2}(v|u) P_{U_2}(u)}{P_{V_2}(v)}. \end{align} Cancelling the common factor $P_{V_1|U_1}(v|u)=P_{V_2|U_2}(v|u)$, we obtain

\begin{align} \frac{P_{U_1}(u)}{P_{V_1}(v)}= \frac{ P_{U_2}(u)}{P_{V_2}(v)} \end{align}

Summing over $u$ in the above, and using $\sum_u P_{U_1}(u)=\sum_u P_{U_2}(u)=1$, we conclude that \begin{align} P_{V_1}(v)=P_{V_2}(v). \end{align} The same argument gives $P_{U_1}(u)=P_{U_2}(u)$.
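As a sanity check on the discrete argument, here is a small numerical sketch in Python (with a hypothetical strictly positive joint pmf): it recovers the marginal of $V$ from the two conditional matrices alone, using exactly the identity from the proof, $P_V(v) = 1/\sum_u \frac{P_{U|V}(u|v)}{P_{V|U}(v|u)}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A strictly positive joint pmf for (U, V) on a 3x4 grid (hypothetical example).
joint = rng.uniform(0.1, 1.0, size=(3, 4))
joint /= joint.sum()

p_u = joint.sum(axis=1)          # marginal of U
p_v = joint.sum(axis=0)          # marginal of V

# Conditional matrices: P_{U|V}(u|v) and P_{V|U}(v|u).
p_u_given_v = joint / p_v        # divide each column v by P_V(v)
p_v_given_u = (joint.T / p_u).T  # divide each row u by P_U(u)

# Recover P_V from the conditionals alone via
#   P_V(v) = 1 / sum_u P_{U|V}(u|v) / P_{V|U}(v|u),
# since each ratio equals P_U(u) / P_V(v) and the P_U(u) sum to 1.
recovered_p_v = 1.0 / (p_u_given_v / p_v_given_u).sum(axis=0)

print(np.allclose(recovered_p_v, p_v))  # True
```

Since the conditionals of $(U_2,V_2)$ agree with those of $(U_1,V_1)$, this reconstruction yields the same marginal for both pairs.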

Question 2: How does one prove this cleanly in the general case?

Best answer.

Simply write $P(U_1 \in A) = E(E(1_A(U_1) \mid V_1)) = E(E(1_A(U_2) \mid V_2)) = P(U_2 \in A).$

Addendum. If the condition $P(U_1 \in A \mid V_1 = v) = P(U_2 \in A \mid V_2 = v)$ holds for all Borel $A$ and all $v \in \mathbf{R}$, then, by uniqueness of the conditional distribution, $P(U_1 \in A \mid V_1)$ and $P(U_2 \in A \mid V_2)$ are, on $(\Omega, \mathscr{F}, \mathbf{P})$, the same random variable up to a $\mathbf{P}$-null set. Hence their expectations on $(\Omega, \mathscr{F}, \mathbf{P})$ coincide. It is important to remember that the relation $$E(X) = \int\limits_\Omega X \, d\mathbf{P} = \int\limits_\mathbf{R} x \, dP_X$$ holds for any random variable $X$ (the "change of measures" formula).
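The change-of-measures formula mentioned above can be illustrated on a finite probability space (a hypothetical $\Omega$ and $X$ chosen for the sketch): $E(X)$ computed as a sum over $\Omega$ against $\mathbf{P}$ agrees with the sum over the range of $X$ against the pushforward distribution $P_X$.

```python
from collections import Counter

# A finite probability space (Omega, P) and a random variable X (hypothetical example).
P = {"a": 0.2, "b": 0.5, "c": 0.3}
X = {"a": 1.0, "b": 2.0, "c": 1.0}   # note X is not injective

# E(X) as an integral over Omega against P.
e_omega = sum(X[w] * P[w] for w in P)

# The pushforward (distribution) P_X on the real line.
P_X = Counter()
for w in P:
    P_X[X[w]] += P[w]

# E(X) as an integral over R against P_X.
e_real = sum(x * p for x, p in P_X.items())

print(e_omega, e_real)  # both equal 1.5
```

The same agreement holds for $1_A(U_i)$ in place of $X$, which is what the answer's one-line computation uses.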