I'm learning the maths behind error correction codes. For this purpose I made this question for myself:
Assume there are two random bits $x_0$, $x_1$, which are i.i.d. with a 50% chance each of being 0 or 1 (the information bits), and an additional check bit $x_2 = x_0 \mathbin{\mathsf{XOR}} x_1$. You now transfer all 3 bits through an additive white Gaussian noise channel (AWGNC), one by one. The AWGNC adds noise to each bit independently, and on the receiver side you can only restore each bit with some probability, depending on what you received. Doing this for each bit individually and independently of the others, you conclude that the probabilities of the 3 bits being 1 are $p = (0.2, 0.9, 0.7)$, i.e.
$P(x_0 = 1 \;|\; \text{the noisy version of $x_0$ that you received}) = 0.2$,
$P(x_1 = 1 \;|\; \text{the noisy version of $x_1$ that you received}) = 0.9$,
$P(x_2 = 1 \;|\; \text{the noisy version of $x_2$ that you received}) = 0.7$.
Obviously, the best guess $y$ for the sent bits $x$ is $y = (0, 1, 1)$. How do I calculate the probability that this guess is correct? Or the probability for any other guess? I.e., how do I calculate $P(x = (0, 1, 1) \;|\; p = (0.2, 0.9, 0.7))$, or the analogous probability for other guesses?
Using the fact that the originally sent $x_2$ is the XOR of the other two originally sent bits, we can use the probabilities $p_0$ and $p_1$ to infer another probability about $x_2$:
$$ \begin{align} p_2' := {} &P(x_2 = 1 \mid p_0 = 0.2, p_1 = 0.9) \\[4pt] = {} & P(x_0 = 0 \mid p_0 = 0.2) \cdot P(x_1 = 1 \mid p_1 = 0.9) \\ {} & + P(x_0 = 1 \mid p_0 = 0.2) \cdot P(x_1 = 0 \mid p_1 = 0.9)\\[4pt] = {} & (1 - 0.2) \cdot 0.9 + 0.2 \cdot (1 - 0.9) \\[4pt] = {} & 0.8 \cdot 0.9 + 0.2 \cdot 0.1\\[4pt] = {} & 0.72 + 0.02\\[4pt] = {} & 0.74. \end{align} $$
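The arithmetic above can be sanity-checked in a few lines of Python (the variable names are mine, not from the question):

```python
# XOR combination: x2 = x0 XOR x1 is 1 exactly when the two bits differ, so
# P(x2 = 1) = P(x0 = 0) * P(x1 = 1) + P(x0 = 1) * P(x1 = 0).
p0, p1 = 0.2, 0.9

p2_prime = (1 - p0) * p1 + p0 * (1 - p1)
print(round(p2_prime, 10))  # 0.74
```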
How to combine the probabilities $p_2' = 0.74$ and $p_2 = 0.7$ into the combined probability $p_2'' := P(x_2 = 1 \;|\; p = (0.2, 0.9, 0.7))$?
Does the question make sense in this form? Do I need to know the distribution of the probability after the channel, given a bit's value before the channel?
To give an illustration and overview of the relations for transferring a single bit $b$:

Enumerate all 4 possibilities for the correct values of $x$. Apply Bayes rule.
Let $x=(x_0,x_1,x_2)$ denote the true values and $\tilde{x}=(\tilde{x}_0,\tilde{x}_1,\tilde{x}_2)$ the observed values. There are four possibilities for $x$: 000, 011, 101, 110. You know the prior on $x$: since $x_0$ and $x_1$ are uniform and independent, $\Pr[x=abc] = 1/4$ for each $abc \in \{000,011,101,110\}$. You also have a model for the channel (based on the parameters of the AWGNC), i.e., you are given
$$q_i = p(\tilde{x}_i=d | x_i=a)$$
where $p(\cdot | x_i=a)$ is the pdf of $\tilde{x}_i$, conditioned on $x_i=a$.
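For concreteness, here is one way this conditional pdf could look. The BPSK mapping (bit 0 $\to$ $+1$, bit 1 $\to$ $-1$) and the noise standard deviation `sigma` are assumptions of mine, not given in the question:

```python
import math

# Sketch of the channel model q_i = p(x~_i = d | x_i = a), assuming BPSK
# modulation (0 -> +1, 1 -> -1) over an AWGN channel with noise std sigma.
def awgn_likelihood(received, bit, sigma=1.0):
    """Gaussian pdf of the received real value, conditioned on the sent bit."""
    mean = 1.0 if bit == 0 else -1.0
    z = (received - mean) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))
```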
Finally, your goal is to compute
$$\Pr[x = abc | \tilde{x} = def],$$
where $def$ are the observed values of $\tilde{x}$ and $abc$ are the hypothesized/inferred values of $x$. This conditional probability can be computed with Bayes rule, i.e.,
$$\Pr[x = abc | \tilde{x} = def] = {p(\tilde{x}=def | x=abc) \Pr[x=abc] \over \sum_{a'b'c'} p(\tilde{x}=def | x=a'b'c') \Pr[x=a'b'c']}. $$
Notice that your independence assumption tells you that
$$p(\tilde{x}=def | x=abc) = p(\tilde{x}_0=d | x_0=a) \cdot p(\tilde{x}_1=e | x_1=b) \cdot p(\tilde{x}_2=f | x_2=c).$$
As a result, all of the terms in the RHS of Bayes rule can be computed with the information given to you. This lets you compute the conditional probability that was your goal.
Work through an example to see this in action.
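As a sketch of such an example, using the numbers from the question: if each per-bit posterior $p_i$ was computed under a uniform prior (an assumption, but consistent with the setup), then the per-bit likelihood of $x_i = 1$ is proportional to $p_i$ and that of $x_i = 0$ to $1 - p_i$, so Bayes' rule over the four codewords reduces to normalizing products of these factors:

```python
import math
from itertools import product

p = (0.2, 0.9, 0.7)

# The four valid codewords 000, 011, 101, 110 (third bit is the XOR check).
codewords = [(a, b, a ^ b) for a, b in product([0, 1], repeat=2)]

def weight(word):
    # Proportional to p(x~ = def | x = word); the uniform prior 1/4 cancels
    # in the normalization, so it is omitted.
    return math.prod(p[i] if bit else 1 - p[i] for i, bit in enumerate(word))

total = sum(weight(w) for w in codewords)
posterior = {w: weight(w) / total for w in codewords}

print(posterior[(0, 1, 1)])  # ~0.846, the probability the best guess is correct
# The combined p_2'' from the question is the total posterior mass of the
# codewords whose third bit is 1:
p2_combined = posterior[(0, 1, 1)] + posterior[(1, 0, 1)]
print(p2_combined)           # ~0.869
```

The unnormalized weights are $0.024$, $0.504$, $0.014$, $0.054$ for 000, 011, 101, 110 respectively, so $P(x = 011 \mid p) = 0.504/0.596 \approx 0.846$.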