I have three binary variables, $M,R,S$ and the following information: $$ P(M_o) = p \quad P(M_p) = 1 - p \\ P(R_p) = q \quad P(R_u) = 1 - q \\ $$ where $M$ and $R$ are independent of each other. Further, we know that for $S_i$ $$ P(S_p|M_o, R_p) = 1 \quad P(S_p|M_o, R_u) = \beta \\ P(S_u|M_o, R_p) = 0 \quad P(S_u|M_o, R_u) = 1 - \beta \\ P(S_p|M_p, R_p) = 1 \quad P(S_p|M_p, R_u) = 0 \\ P(S_u|M_p, R_p) = 0 \quad P(S_u|M_p, R_u) = 1. $$ I want to apply Bayes' rule to get at the following two probabilities: $P(M_o|S_p)$ and $P(R_p|S_p)$. For a bit of context with slight abuse of notation, think of $M_i$ as two different types, where $M_p$ always reveals information about $R_i$ (meaning $S_i = R_i$) while $M_o$ distorts the information (meaning $S_i \neq R_i$) with probability $\beta$. I want to know the probability that we are dealing with the distorting type after having observed the message $S_p$.
I am unsure both of how to approach the problem and whether it contains sufficient information to be solved.
For the first probability, Bayes' rule gives $$P(M_o | S_p) = \frac{P(M_o,S_p)}{P(S_p)}. $$
You can then find $P(M_o,S_p)$ by partitioning over $R$ and applying the law of total probability, together with the independence of $M$ and $R$:
$$ P(M_o,S_p) = P(M_o,S_p,R_u) + P(M_o,S_p,R_p) = $$ $$ P(S_p|M_o,R_u) P(M_o,R_u) + P(S_p|M_o,R_p) P(M_o,R_p) = $$ $$ P(S_p|M_o,R_u)P(M_o)P(R_u)+ P(S_p|M_o,R_p)P(M_o)P(R_p) = \beta p(1-q)+pq$$
Doing the same for $P(S_p)$:
$$P(S_p) = P(M_o,S_p)+ P(M_p,S_p) = $$ $$P(M_o,S_p) + P(M_p,S_p,R_p) + P(M_p,S_p,R_u) = $$ $$\beta p(1-q) +pq+ q(1-p) + 0 = \beta p(1-q)+q$$ where the last term vanishes because $P(S_p|M_p,R_u) = 0$.
then, $$ \fbox{$P(M_o | S_p) = \frac{ \beta p(1-q)+pq }{ \beta p(1-q)+q }$} $$
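The boxed result can be sanity-checked numerically by enumerating the full joint distribution $P(M,R,S)$ and conditioning by brute force. A minimal sketch, with arbitrary sample values for $p$, $q$, $\beta$:

```python
from itertools import product

# Arbitrary sample values for the parameters p, q, beta.
p, q, beta = 0.3, 0.6, 0.2

# P(S_p | M, R) from the table in the question; M in {"o","p"}, R in {"p","u"}.
P_Sp_given = {("o", "p"): 1.0, ("o", "u"): beta,
              ("p", "p"): 1.0, ("p", "u"): 0.0}

# Enumerate the joint P(M, R, S), using independence of M and R.
joint = {}
for M, R in product("op", "pu"):
    pM = p if M == "o" else 1 - p
    pR = q if R == "p" else 1 - q
    joint[(M, R, "p")] = pM * pR * P_Sp_given[(M, R)]
    joint[(M, R, "u")] = pM * pR * (1 - P_Sp_given[(M, R)])

P_Sp = sum(v for (M, R, S), v in joint.items() if S == "p")
P_Mo_Sp = sum(v for (M, R, S), v in joint.items() if M == "o" and S == "p")

posterior = P_Mo_Sp / P_Sp
closed_form = (beta * p * (1 - q) + p * q) / (beta * p * (1 - q) + q)
assert abs(posterior - closed_form) < 1e-12
```

The assertion passes for any parameter values in $(0,1)$, since both expressions compute the same quantity.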
Remark: all of this reasoning relies on the variables being binary and on the independence of $M$ and $R$.
For the second probability, $P(R_p|S_p)$, apply the same reasoning.
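As a hint for that second computation: partitioning $P(R_p,S_p)$ over $M$ uses $P(S_p|M_o,R_p) = P(S_p|M_p,R_p) = 1$, so the numerator collapses to $pq + (1-p)q = q$. A short numeric check of this step, with arbitrary sample parameter values:

```python
# Arbitrary sample values for p, q, beta (any values in (0,1) work).
p, q, beta = 0.3, 0.6, 0.2

# Partition P(R_p, S_p) over M, as was done above for P(M_o, S_p):
# P(S_p|M_o,R_p) = 1 and P(S_p|M_p,R_p) = 1, so P(R_p,S_p) = pq + (1-p)q = q.
P_Rp_Sp = 1.0 * p * q + 1.0 * (1 - p) * q
P_Sp = beta * p * (1 - q) + q  # marginal derived earlier in the answer

posterior_R = P_Rp_Sp / P_Sp
assert abs(P_Rp_Sp - q) < 1e-12
assert abs(posterior_R - q / (beta * p * (1 - q) + q)) < 1e-12
```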