I'm trying to understand conditional independence and want to show, for the following Bayesian network:
$$p(A,B,C,D,E) = p(A)p(B)p(C|A)p(D|B)p(E|D,C)$$
that $C$ is conditionally independent of $B$ given its parent $A$. So I guess I want to arrive at something like
$$p(C,B|A) = p(C|A)p(B|A)$$
which is the definition of conditional independence I've seen.
I have started in the following manner, but I don't really know how to proceed:
$$p(C,B|A) \propto \sum_{E}\sum_{D} p(A,B,C,D,E) $$ $$\propto \sum_{E}\sum_{D} p(A)p(B)p(C|A)p(D|B)p(E|D,C)$$
and I'm not sure what I'm allowed to pull in front of the sums, or how to simplify further.
Multiplication distributes over addition, so you can pull out common factors that do not contain the variable of summation.$$\def\p{\mathop p}\begin{align*}xy_1+xy_2&=x\,(y_1+y_2)\\[1ex]\sum_{i=1}^2xy_i&=x\sum_{i=1}^2 y_i\end{align*}$$
And likewise, (with $\sum\limits_{\mathrm Y}$ understood as $\sum_{\mathrm Y\in\{Y,Y^{\small\complement}\}}$), we have:
$$\begin{align*}\sum_{\mathrm Y}\p(X)\p(\mathrm Y\mid X)&=\p(X)\sum_{\mathrm Y}\p(\mathrm Y\mid X)\end{align*}$$
And such.
This is your next step.
$$\begin{align*}\p(B,C\mid A) &=\sum_{\mathrm D,\mathrm E} \p(B,C,\mathrm D,\mathrm E\mid A)&&\text{Law of Total Probability}\\[1ex]&=\sum_{\mathrm D,\mathrm E} \p(B)\p(C\mid A)\p(\mathrm D\mid B)\p(\mathrm E\mid \mathrm D,C)&&\text{Factorisation}^\dagger\\&=\p(B)\p(C\mid A)\sum_{\mathrm D}\p(\mathrm D\mid B)\sum_{\mathrm E}\p(\mathrm E\mid \mathrm D, C)&&\text{Distribution}\\[1ex]&~~\vdots\\[2ex]&=\p(B)\p(C\mid A)\end{align*}$$
Note: $\p(B\mid A)=\p(B)$ may be proven similarly.
† $\small\p(B,C,\mathrm D,\mathrm E\mid A)=\tfrac{\p(A,B,C,\mathrm D,\mathrm E)}{\p(A)}$, so ...
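If you want to sanity-check the identity numerically, here is a small brute-force sketch. It builds a hypothetical binary network with randomly chosen conditional probability tables that matches the factorisation $p(A,B,C,D,E)=p(A)p(B)p(C|A)p(D|B)p(E|D,C)$, then verifies $\p(B,C\mid A)=\p(B\mid A)\,\p(C\mid A)$ by explicit summation (all names here are made up for the illustration):

```python
import itertools
import random

random.seed(0)

# Hypothetical binary network matching the factorisation
# p(A,B,C,D,E) = p(A) p(B) p(C|A) p(D|B) p(E|D,C),
# with all conditional probability tables chosen at random.
pA = {True: 0.3, False: 0.7}
pB = {True: 0.6, False: 0.4}

def random_cpt(n_parents):
    """Random CPT: maps each parent configuration to P(child = True)."""
    return {cfg: random.random()
            for cfg in itertools.product([True, False], repeat=n_parents)}

cptC = random_cpt(1)  # C depends on A
cptD = random_cpt(1)  # D depends on B
cptE = random_cpt(2)  # E depends on D and C

def bern(p_true, val):
    return p_true if val else 1 - p_true

def joint(a, b, c, d, e):
    return (pA[a] * pB[b]
            * bern(cptC[(a,)], c)
            * bern(cptD[(b,)], d)
            * bern(cptE[(d, c)], e))

def marginal(**fixed):
    """Sum the joint over all variables not pinned in `fixed`."""
    total = 0.0
    for a, b, c, d, e in itertools.product([True, False], repeat=5):
        vals = dict(A=a, B=b, C=c, D=d, E=e)
        if all(vals[k] == v for k, v in fixed.items()):
            total += joint(a, b, c, d, e)
    return total

# Check p(B,C|A) = p(B|A) p(C|A) for every assignment.
for a, b, c in itertools.product([True, False], repeat=3):
    lhs = marginal(A=a, B=b, C=c) / marginal(A=a)
    rhs = (marginal(A=a, B=b) / marginal(A=a)) * (marginal(A=a, C=c) / marginal(A=a))
    assert abs(lhs - rhs) < 1e-12
print("B and C are conditionally independent given A")
```

Because the CPTs are random, this passes for *any* parameter values, which mirrors the point of the derivation: the independence follows from the factorisation alone, not from particular numbers.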