This is my problem
My problem is modeled by a basic Bayesian Network with only two layers: I have parent and child nodes, but the children have no children of their own, so the structure is essentially a bipartite graph. Each child depends on one or more of the parents, and these edges/relations have a probability. The model is also extended by completing the bipartite graph with low-probability edges to account for noisy data. So in the finished model each child $(C_x)$ has a dependency on each parent $(P_x)$.
What I want to be able to do is infer the most probable solution by maximizing
$$ \max_{P_1,P_2,\dots,P_n} P(P_1,P_2,\dots,P_n|C_1,C_2,\dots,C_m) $$ where $ C_i \in \{0,1\} $ and $ P_j \in \{0,1\} $, for a given observation of the state of the children.
So for an observation $O=\{C_1=0, C_2=1, C_3=1\}$
I want to calculate $$P(P_1=1,P_2=0,P_3=0|C_1=0, C_2=1, C_3=1)$$ $$P(P_1=0,P_2=1,P_3=0|C_1=0, C_2=1, C_3=1)$$ $$P(P_1=1,P_2=1,P_3=0|C_1=0, C_2=1, C_3=1)$$ and so on.
Here is how I try to solve it
I want to be thorough, so I will explain how I try to solve this. I'm not sure that I'm doing it right, so please point out if I'm doing something wrong.
What I do first is complete the conditional probability tables (CPTs) for the child nodes. From the model, the CPT for a child node $C_x$ looks something like this; the probabilities are chosen to make the example easy.
\begin{array}{ l l l|l l } P_1 & P_2 & P_3 & P & \lnot P \\ \hline 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0.1 & 0.9 \\ 0 & 1 & 0 & 0.4 & 0.6 \\ 1 & 0 & 0 & 0.2 & 0.8 \\ \end{array}
And I complete this table by calculating
\begin{array}{ l l l|l l } P_1 & P_2 & P_3 & P & \lnot P \\ \hline 0 & 1 & 1 & (1- \lnot P) & 0.9 \times 0.6 \\ 1 & 0 & 1 & (1- \lnot P) & 0.9 \times 0.8 \\ 1 & 1 & 0 & (1- \lnot P) & 0.6 \times 0.8 \\ 1 & 1 & 1 & (1- \lnot P) & 0.9 \times 0.6 \times 0.8 \\ \end{array}
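In code, the completion I do for the remaining rows looks roughly like this (a small Python sketch; the per-parent activation probabilities are the ones from the single-parent rows of the table above):

```python
# Sketch of how I complete a child's CPT: every active parent independently
# fails to trigger the child with its (1 - p_i), so the combined "not
# triggered" probability is the product of those factors (noisy-OR style).
def complete_row(active_parent_probs):
    """active_parent_probs: activation probabilities of the parents that are
    1 in the row, e.g. [0.4, 0.1] for the row P_1=0, P_2=1, P_3=1."""
    not_p = 1.0
    for p in active_parent_probs:
        not_p *= 1.0 - p
    return 1.0 - not_p, not_p  # the (P, ¬P) columns of the completed row

# Row P_1=0, P_2=1, P_3=1 from the table above: ¬P = 0.6 * 0.9 = 0.54
print(complete_row([0.4, 0.1]))
```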
And then I calculate the joint conditional probability by doing something like
$$P(P_1=1,P_2=0,P_3=0|C_1=0, C_2=1, C_3=1) = $$ $$P(P_1=1,P_2=0,P_3=0|C_1=0) \times$$ $$P(P_1=1,P_2=0,P_3=0|C_2=1) \times$$ $$P(P_1=1,P_2=0,P_3=0|C_3=1)$$
Basically, I'm not convinced that the last step is correct. Any help or comments are greatly appreciated!
Yes, this is school related but it is not regular homework. It is a part of a thesis project I'm doing at a company.
Graph update on request

Here is a graph example of my model. The regular edges are direct dependencies, and the dotted edges are noisy or guess edges added to allow for noisy data. Guess edges have a low probability of $p = 0.0001$ and the regular edges all have the probability $(1-p)$. In my problem I do not care about the probability of individual events like $P(L_1)$ or $P(C_2)$, so they are assumed to be $1$. I make an observation on the state of the variables $C_1 \dots C_n$, and given the graph model above I want to infer the most plausible cause. The possible causes are $P_1 \dots P_n$, or the combination of them that is most likely, as in the example observation above.
Here is an example of the basic truth tables for this graph. \begin{array}{ l|l l } C_1 & T & F \\ \hline P_1 & 0.9999 & 0.0001 \\ P_2 & 0.9999 & 0.0001 \\ P_3 & 0.9999 & 0.0001 \\ \end{array}
\begin{array}{ l|l l } C_2 & T & F \\ \hline P_1 & 0.0001 & 0.9999 \\ P_2 & 0.9999 & 0.0001 \\ P_3 & 0.9999 & 0.0001 \\ \end{array}
\begin{array}{ l|l l } C_3 & T & F \\ \hline P_1 & 0.0001 & 0.9999 \\ P_2 & 0.0001 & 0.9999 \\ P_3 & 0.9999 & 0.0001 \\ \end{array}
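Written out as data, the model in these tables reduces to one probability per (child, parent) edge; a minimal sketch (the names and layout are just for illustration):

```python
# Per-edge probabilities from the graph: regular edges get 1 - p, guess
# (dotted) edges get p = 0.0001, matching the truth tables above.
p_guess = 0.0001
edge_prob = {
    "C1": {"P1": 1 - p_guess, "P2": 1 - p_guess, "P3": 1 - p_guess},
    "C2": {"P1": p_guess,     "P2": 1 - p_guess, "P3": 1 - p_guess},
    "C3": {"P1": p_guess,     "P2": p_guess,     "P3": 1 - p_guess},
}
```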
Does that make it any clearer?
First of all, your CPTs are not correct. You must have $2^3=8$ rows for all possible combinations of $P_i \in \{\text{True}, \text{False}\}$, $i=1,2,3$, which are the parent nodes of each $C_i$, $i=1,2,3$. For example,
$$ \begin{array}{c|c|c|c} P_1 & P_2 & P_3 & Pr(C_1=\text{T}|P_1,P_2,P_3) \\ \hline \text{T} & \text{T} & \text{T} & 0.4 \\ \text{T} & \text{T} & \text{F} & 0.5 \\ \text{T} & \text{F} & \text{T} & 0.3 \\ \text{T} & \text{F} & \text{F} & 0.7 \\ \text{F} & \text{T} & \text{T} & 0.1 \\ \text{F} & \text{T} & \text{F} & 0.9 \\ \text{F} & \text{F} & \text{T} & 0.2 \\ \text{F} & \text{F} & \text{F} & 0.6 \\ \end{array} $$ As I understand the problem from your BN graph, you are looking for the maximum probable state of $P_i$'s given $C_i$'s,
$$\max_{P_1,P_2,P_3} Pr(P_1,P_2,P_3|C_1,C_2,C_3)$$
Your Bayesian Network joint probability can be written as (dropping the parent priors $Pr(P_j)$, since you assume them to be $1$),
$$Pr_{BN} = Pr(P_1,P_2,P_3,C_1,C_2,C_3)=\prod_{i=1}^3 Pr(C_i|P_1,P_2,P_3) \tag{1}$$
and
$$Pr_{BN} = Pr(P_1,P_2,P_3|C_1,C_2,C_3)Pr(C_1,C_2,C_3)$$
You can find $Pr(C_1,C_2,C_3)$ by marginalizing $Pr(P_1,P_2,P_3,C_1,C_2,C_3)$.
$$Pr(C_1,C_2,C_3)=\sum_{P_1=\{\text{T},\text{F}\}}\sum_{P_2=\{\text{T},\text{F}\}}\sum_{P_3=\{\text{T},\text{F}\}}Pr(P_1,P_2,P_3,C_1,C_2,C_3)\tag{2}$$
So here are the steps (a small code sketch of them follows the list):
1) Redesign CPT's correctly
2) Compute $Pr_{BN}$ using formula (1)
3) Compute $Pr(C_1,C_2,C_3)$ by marginalizing $Pr_{BN}$ using formula (2)
4) Compute $Pr(P_1,P_2,P_3|C_1,C_2,C_3) = \dfrac{Pr_{BN}}{Pr(C_1,C_2,C_3)}$
5) Compute the maximum of $Pr(P_1,P_2,P_3|C_1,C_2,C_3)$ and observe which set of $\{P_1, P_2, P_3\}$ maximizes the probability. That set is the answer.
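Here is a minimal Python sketch of these five steps, assuming three binary parents and three binary children. The CPT values are the illustrative ones from the 8-row table above for $C_1$; for lack of better numbers, $C_2$ and $C_3$ reuse the same table so the sketch runs, but in practice each child has its own CPT:

```python
from itertools import product

# Illustrative CPT Pr(C_i = True | P_1, P_2, P_3), keyed by the parent
# assignment; these are the example numbers from the 8-row table above.
cpt_c1 = {
    (True,  True,  True):  0.4, (True,  True,  False): 0.5,
    (True,  False, True):  0.3, (True,  False, False): 0.7,
    (False, True,  True):  0.1, (False, True,  False): 0.9,
    (False, False, True):  0.2, (False, False, False): 0.6,
}
cpts = {"C1": cpt_c1, "C2": cpt_c1, "C3": cpt_c1}  # placeholder: one CPT per child

observation = {"C1": False, "C2": True, "C3": True}  # observed child states

def joint(parents, obs):
    """Formula (1): product over children of Pr(C_i = obs_i | P_1, P_2, P_3)."""
    pr = 1.0
    for child, value in obs.items():
        p_true = cpts[child][parents]
        pr *= p_true if value else (1.0 - p_true)
    return pr

# Formula (2): marginalize the joint over all 2^3 parent assignments.
assignments = list(product([True, False], repeat=3))
pr_c = sum(joint(a, observation) for a in assignments)

# Steps 4-5: normalize to get Pr(P_1,P_2,P_3 | C_1,C_2,C_3) and maximize.
posterior = {a: joint(a, observation) / pr_c for a in assignments}
best = max(posterior, key=posterior.get)
print("most probable parent assignment:", best, "with probability", posterior[best])
```

Note that the denominator in step 4 is the same for every parent assignment, so the maximizing assignment of the posterior is also the maximizing assignment of $Pr_{BN}$; the normalization just makes the numbers interpretable as probabilities.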
Edit 1: The way you are estimating the CPTs is not correct. I assume that the events $P_i$ are independent and that they are conditionally independent given the event $C_i$; hence,
$$ \begin{align} Pr(P_1,P_2,P_3|C_i) & = Pr(P_1|C_i)Pr(P_2|C_i)Pr(P_3|C_i) \\ & = \frac{Pr(C_i|P_1)Pr(C_i|P_2)Pr(C_i|P_3)Pr(P_1)Pr(P_2)Pr(P_3)}{[Pr(C_i)]^3}\tag{3} \end{align} $$
on the other hand from Bayes rule we have,
$$ \begin{align} Pr(C_i|P_1,P_2,P_3) &= \frac{Pr(P_1,P_2,P_3|C_i)Pr(C_i)}{Pr(P_1,P_2,P_3)}\\ &=\frac{Pr(P_1,P_2,P_3|C_i)Pr(C_i)}{Pr(P_1)Pr(P_2)Pr(P_3)}\tag{4} \end{align} $$
From (3) and (4),
$$Pr(C_i|P_1,P_2,P_3) = \alpha Pr(C_i|P_1)Pr(C_i|P_2)Pr(C_i|P_3)\tag{5}$$
where $\alpha = \frac{1}{[Pr(C_i)]^2}$.
Your approach has two flaws. First, you don't take into account the effect of $\alpha$, and second, in general,
$$Pr(C_i|P_1=1,P_2=0,P_3=0)\neq Pr(C_i|P_1=1)$$
In other words,
$$Pr(C_i=1|P_1=1,P_2=0,P_3=0)=0.2, Pr(C_i=1|P_1=0,P_2=1,P_3=0)=0.4 \not\Rightarrow Pr(C_i=1|P_1=1,P_2=1,P_3=0) = 0.2\times 0.4$$
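As a concrete check, even under the noisy-OR-style completion used in the question, the $(1,1,0)$ row would come out as
$$Pr(C_i=1|P_1=1,P_2=1,P_3=0)=1-(1-0.2)(1-0.4)=0.52,$$
which is nowhere near $0.2\times 0.4=0.08$.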