I'm deriving the probabilistic latent semantic analysis (pLSA) model. In the model, documents $d$ and words $w$ are observed. $$\begin{align} \Pr(d,w) &= \sum_{c} \Pr (d,w,c) \\ &= \sum_{c} \Pr(d) \Pr(w,c|d)\\ &= \Pr(d) \sum_{c} \frac{\Pr(w,c,d)}{\Pr(d)} \\ &= \Pr(d) \sum_{c} \Pr(w|c,d) \Pr(c|d) \end{align}$$
I used Bayes' theorem to go from the second line to the third. However, Wikipedia gives $\Pr(d) \sum_{c} \Pr(w|c) \Pr(c|d)$. Is this because $d$ and $w$ are independent? If so, how can I read off that independence from the graphical model (plate notation)?
You should provide more information about what exactly the random variables are (the Wikipedia article is also light on detail). From what Wikipedia says, $d$ and $w$ are assumed to be conditionally independent given $c$. This means that $$\Pr(d,w,c) = \Pr(c)\Pr(d|c)\Pr(w|c).$$ Now applying Bayes' theorem to the first two factors, $\Pr(c)\Pr(d|c) = \Pr(d)\Pr(c|d)$, so this quantity equals $\Pr(d)\Pr(c|d)\Pr(w|c)$.
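Written out, summing the factorised joint over $c$ recovers exactly the expression from Wikipedia:
$$\Pr(d,w) = \sum_{c} \Pr(d,w,c) = \sum_{c} \Pr(d)\Pr(c|d)\Pr(w|c) = \Pr(d) \sum_{c} \Pr(w|c) \Pr(c|d).$$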
You could also continue your own line of thought: conditional independence of $w$ and $d$ given $c$ is equivalent to saying that $\Pr(w|c,d) = \Pr(w|c)$, which turns the last line of your derivation into Wikipedia's expression.
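As a sanity check, here is a minimal numeric sketch (with made-up sizes and distributions, not anything from pLSA training) verifying that the symmetric factorisation $\sum_c \Pr(c)\Pr(d|c)\Pr(w|c)$ and the asymmetric one $\Pr(d)\sum_c \Pr(c|d)\Pr(w|c)$ yield the same joint $\Pr(d,w)$:

```python
import numpy as np

# Toy pLSA setup (hypothetical sizes, random distributions, just to check the algebra):
# 2 topics c, 3 documents d, 4 words w.
rng = np.random.default_rng(0)
n_c, n_d, n_w = 2, 3, 4

p_c = rng.random(n_c)
p_c /= p_c.sum()                                        # Pr(c)
p_d_given_c = rng.random((n_c, n_d))
p_d_given_c /= p_d_given_c.sum(axis=1, keepdims=True)   # Pr(d|c), rows sum to 1
p_w_given_c = rng.random((n_c, n_w))
p_w_given_c /= p_w_given_c.sum(axis=1, keepdims=True)   # Pr(w|c), rows sum to 1

# Joint under conditional independence: Pr(d,w,c) = Pr(c) Pr(d|c) Pr(w|c)
joint = p_c[:, None, None] * p_d_given_c[:, :, None] * p_w_given_c[:, None, :]

# Symmetric form: Pr(d,w) = sum_c Pr(c) Pr(d|c) Pr(w|c)
p_dw_symmetric = joint.sum(axis=0)

# Asymmetric (Wikipedia) form: Pr(d,w) = Pr(d) sum_c Pr(c|d) Pr(w|c)
p_d = joint.sum(axis=(0, 2))                            # marginal Pr(d)
p_c_given_d = joint.sum(axis=2) / p_d                   # Pr(c|d), shape (c, d)
p_dw_asymmetric = p_d[:, None] * np.einsum('cd,cw->dw', p_c_given_d, p_w_given_c)

print(np.allclose(p_dw_symmetric, p_dw_asymmetric))     # prints True
```

The two parameterisations agree numerically, which is exactly the equivalence the Bayes step above establishes algebraically.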