I am confused about the yellow-highlighted equation (the picture is taken from Murphy, Machine Learning). I guess that the denominator comes from marginalization. Although I have already looked at similar posts, I haven't managed to understand it fully (and I am not sure whether my assumptions are correct).
Questions:
(1) Does it hold for the denominator that the sum equals $p(x_i|\theta)$?
(2) Does $p(z_i=k| \theta)\, p(x_i|z_i=k,\theta) = p(x_i,z_i=k|\theta)$ hold, since this is just the product rule (before $z_i$ is marginalized out)?
I think that something in (1) or (2) is wrong, but I don't know what, because when I plug in my reformulations, this does not yield the usual Bayes theorem that I would expect.
Thanks a million in advance for your help and clarifications! :-)

Both (1) and (2) look fine. To see why (11.6) is true with your substitutions, you need to use the following version of the Bayes formula: \begin{align} P(A|B \cap C) = \frac{P(A|C)P(B|A \cap C)}{P(B|C)}. \quad (\ast) \end{align} Note that this is almost like the regular Bayes formula, except you add conditioning on the event $C$ to all the probabilities in the equation.
You can see why $(\ast)$ is true by expanding both the left-hand side and the right-hand side. The LHS is $$ P(A|B \cap C) = \frac{P(A \cap B \cap C)}{P(B \cap C)}. $$ The RHS is $$ \frac{P(A|C)P(B|A \cap C)}{P(B|C)} = \frac{\frac{P(A\cap C)}{P(C)} \cdot \frac{P(B \cap A \cap C)}{P(A \cap C)}}{\frac{P(B \cap C)}{P(C)}} = \frac{P(A \cap B \cap C)}{P(B \cap C)}. $$ So LHS = RHS, and $(\ast)$ is therefore true. Another way to see why $(\ast)$ holds is to note that $Q(A) := P(A|C)$ is itself a probability measure. Applying the traditional Bayes formula to $Q$, $$ Q(A|B) = \frac{Q(A) Q(B | A)}{Q(B)}, $$ also gives you $(\ast)$ once you take into account that $Q$ is $P$ conditioned on $C$. This is a conceptually trickier way to think about it, but it shows that $(\ast)$ is just a version of the Bayes rule.
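If it helps, you can also check $(\ast)$ numerically. Here is a minimal sketch that builds an arbitrary joint distribution over three binary events (the random weights are just made-up numbers for the check, not anything from the book) and compares both sides of the identity:

```python
# Numerical sanity check of the conditioned Bayes formula (*):
#   P(A | B ∩ C) = P(A|C) P(B | A ∩ C) / P(B|C)
# on a small made-up joint distribution over three binary events.
import itertools
import random

random.seed(0)
# joint probability p[(a, b, c)] over indicators of the events A, B, C
weights = {t: random.random() for t in itertools.product([0, 1], repeat=3)}
total = sum(weights.values())
p = {t: w / total for t, w in weights.items()}

def prob(pred):
    """Probability of the event described by the predicate pred(a, b, c)."""
    return sum(q for (a, b, c), q in p.items() if pred(a, b, c))

# LHS: P(A | B ∩ C)
lhs = prob(lambda a, b, c: a and b and c) / prob(lambda a, b, c: b and c)
# RHS: P(A|C) * P(B | A ∩ C) / P(B|C), each factor expanded from the joint
rhs = (prob(lambda a, b, c: a and c) / prob(lambda a, b, c: c)
       * prob(lambda a, b, c: a and b and c) / prob(lambda a, b, c: a and c)
       / (prob(lambda a, b, c: b and c) / prob(lambda a, b, c: c)))
print(lhs, rhs)  # the two sides agree up to floating-point error
```

Any other joint distribution (with nonzero conditioning events) gives the same agreement, which is what the algebraic expansion above shows in general.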
Anyway, if you rewrite (11.6) using the substitutions (1) and (2) in your question, you get \begin{align} p(z_i=k|x_i,\theta) = \frac{p(z_i=k|\theta) p(x_i|z_i=k,\theta)}{p(x_i|\theta)}. \end{align} If you can convince yourself that $(\ast)$ is true, you should see that this is exactly that formula, with $A = \{z_i=k\}$, $B = \{x_i\}$, and the conditioning on $\theta$ playing the role of $C$.
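To see the formula in action, here is a toy check of (11.6) for a small discrete mixture. The mixture weights $p(z_i=k|\theta)$ and component likelihoods $p(x_i|z_i=k,\theta)$ below are made-up numbers, not from Murphy's book; the point is only that the denominator is $p(x_i|\theta)$ by marginalization, so the responsibilities over $k$ sum to 1:

```python
# Toy instance of (11.6): p(z=k | x, θ) = π_k p(x|z=k,θ) / Σ_j π_j p(x|z=j,θ)
pi = [0.3, 0.7]                       # mixture weights π_k = p(z=k | θ)
lik = {0: [0.5, 0.1], 1: [0.5, 0.9]}  # lik[x][k] = p(x | z=k, θ)

def responsibility(x, k):
    """Posterior p(z=k | x, θ) via (11.6)."""
    num = pi[k] * lik[x][k]                              # joint p(x, z=k | θ)
    den = sum(pi[j] * lik[x][j] for j in range(len(pi)))  # marginal p(x | θ)
    return num / den

# Since the denominator marginalizes out z, the posteriors sum to 1 per x.
for x in (0, 1):
    print(x, [responsibility(x, k) for k in range(2)])
```

This is exactly substitutions (1) and (2): the numerator is the joint $p(x_i, z_i=k|\theta)$ written via the product rule, and the denominator sums that joint over $k$ to get $p(x_i|\theta)$.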