How to derive this equation about the independence event and conditional probability


I was reading pages 6 to 7 in chapter one of the book "Quantum Information Meets Quantum Matter":

Now suppose that the joint distribution $p_{AB}(\omega_i,\lambda_m)$ has no correlation at all, and then from Alice's point of view, her outcome is independent of Bob's outcome. In other words, whatever Bob's outcome is, the probability distribution of Alice's outcome should be just the same. This means that the conditional probability $p_{A \mid B}(\omega_i,\lambda_m)$ should not depend on $\lambda_m$, i.e.,
$$ p_{A \mid B}\left(\omega_i, \lambda_m\right)=p_{A \mid B}\left(\omega_i, \lambda_n\right), \forall i, m, n \quad (1.10) $$
Similarly, from Bob's point of view, one should have
$$ p_{B \mid A}\left(\lambda_m, \omega_i\right)=p_{B \mid A}\left(\lambda_m, \omega_j\right), \forall i, j, m \quad (1.11) $$
We will show that the conditions (1.10) and (1.11) imply that the joint probability distribution equals the product of the probability distributions of each party, i.e.,
$$ p_{A B}\left(\omega_i, \lambda_m\right)=p_A\left(\omega_i\right) p_B\left(\lambda_m\right), \forall i, m \quad (1.12) $$
and vice versa. In other words, the conditions (1.10)–(1.11) and (1.12) are just equivalent. To see this, we first show how to go from (1.11) to (1.12). For $\forall m, i$ we have for $\forall j$,
$$ p_{B \mid A}\left(\omega_i, \lambda_m\right)=p_{B \mid A}\left(\omega_j, \lambda_m\right)=\frac{p_{A B}\left(\omega_j, \lambda_m\right)}{p_A\left(\omega_j\right)} \quad (1.13) $$
Then
$$ p_{B \mid A}\left(\omega_i, \lambda_m\right)=\frac{\sum_{j=0}^{d_A-1} p_{A B}\left(\omega_j, \lambda_m\right)}{\sum_{j=0}^{d_A-1} p_A\left(\omega_j\right)}=p_B\left(\lambda_m\right) \quad (1.14) $$

Here A and B are two independent objects; the possible outcomes for A are the set $\{ \omega_i, i = 0, \dots, d_A - 1 \}$ and the possible outcomes for B are the set $\{ \lambda_m, m = 0, \dots, d_B - 1 \}$. Object A belongs to Alice and object B to Bob.

I can derive the last equality in equation (1.14) using Bayes' rule, but how does the first equality in (1.14) hold?
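As a sanity check of the claimed equivalence between (1.10)–(1.11) and (1.12), here is a small numerical illustration; the marginal probabilities below are made-up values, assumed only for the example:

```python
# Illustrative product distribution (made-up marginals, assumed for the example).
p_A = [0.5, 0.3, 0.2]                       # Alice's marginal over omega_0, omega_1, omega_2
p_B = [0.6, 0.4]                            # Bob's marginal over lambda_0, lambda_1
p_AB = [[a * b for b in p_B] for a in p_A]  # joint per (1.12): p_AB[i][m] = p_A[i] * p_B[m]

# Eq. (1.10): p_{A|B}(omega_i, lambda_m) = p_AB[i][m] / p_B[m] should not depend on m.
for i in range(len(p_A)):
    conditionals = [p_AB[i][m] / p_B[m] for m in range(len(p_B))]
    assert all(abs(c - p_A[i]) < 1e-12 for c in conditionals)
print("p_{A|B} does not depend on lambda_m, as (1.10) requires")
```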

1 Answer


My question is to derive
$$ p_{B \mid A}\left(\omega_i, \lambda_m\right)=\frac{\sum_{j=0}^{d_A-1} p_{A B}\left(\omega_j, \lambda_m\right)}{\sum_{j=0}^{d_A-1} p_A\left(\omega_j\right)}, $$
given that A and B are independent variables, where independence means the conditionals are constant, as in (1.10) and (1.11); here we use the form
$$ p_{B \mid A}\left(\omega_i, \lambda_m\right)=p_{B \mid A}\left(\omega_j, \lambda_m\right), \forall i, j, m. $$
Start from the definition of $p_{B \mid A}$:
$$ p_{B \mid A}\left(\omega_i, \lambda_m\right)=\frac{p_{A B}\left(\omega_i, \lambda_m\right)}{p_A\left(\omega_i\right)}. \quad (1) $$
Multiplying (1) by unity gives
$$ p_{B \mid A}\left(\omega_i, \lambda_m\right)=\frac{\sum_j p_{A B}\left(\omega_j, \lambda_m\right)}{\sum_j p_{A B}\left(\omega_j, \lambda_m\right)}\,\frac{p_{A B}\left(\omega_i, \lambda_m\right)}{p_A\left(\omega_i\right)} = \frac{\sum_j p_{A B}\left(\omega_j, \lambda_m\right)}{\sum_j p_{A B}\left(\omega_j, \lambda_m\right)/p_{A B}\left(\omega_i, \lambda_m\right)\, p_A\left(\omega_i\right)}. \quad (2) $$
By the definition of independence of random variables and Eq. (1),
$$ \frac{p_{A B}\left(\omega_i, \lambda_m\right)}{p_A\left(\omega_i\right)} = \frac{p_{A B}\left(\omega_j, \lambda_m\right)}{p_A\left(\omega_j\right)}, $$
so
$$ \frac{p_A\left(\omega_j\right)}{p_A\left(\omega_i\right)} = \frac{p_{A B}\left(\omega_j, \lambda_m\right)}{p_{A B}\left(\omega_i, \lambda_m\right)}. \quad (3) $$
Inserting Eq. (3) into Eq. (2), the denominator becomes $\sum_j \frac{p_A(\omega_j)}{p_A(\omega_i)}\, p_A(\omega_i) = \sum_j p_A(\omega_j)$, which completes the derivation.
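The derivation can also be verified numerically. A minimal sketch, assuming an illustrative product distribution for the joint (the specific probabilities are made up for the example):

```python
# Numerical check of the chain (1) -> (2) -> (3) -> (1.14) on a product distribution.
p_A = [0.2, 0.5, 0.3]                       # Alice's marginal; sum_j p_A(omega_j) = 1
p_B = [0.7, 0.3]                            # Bob's marginal
p_AB = [[a * b for b in p_B] for a in p_A]  # independent joint distribution

for i in range(len(p_A)):
    for m in range(len(p_B)):
        lhs = p_AB[i][m] / p_A[i]                       # p_{B|A}(omega_i, lambda_m), Eq. (1)
        num = sum(p_AB[j][m] for j in range(len(p_A)))  # sum_j p_AB(omega_j, lambda_m)
        den = sum(p_A)                                  # sum_j p_A(omega_j) = 1
        assert abs(lhs - num / den) < 1e-12             # first equality in (1.14)
        assert abs(lhs - p_B[m]) < 1e-12                # and it equals p_B(lambda_m)
print("Eq. (1.14) holds for the product distribution")
```

Note that the equality relies on independence: for a correlated joint distribution, $p_{B \mid A}(\omega_i, \lambda_m)$ would genuinely depend on $\omega_i$ and the first equality would fail.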