I am trying to understand the derivation of the main equation in the seminal paper "Deep Neural Networks as Gaussian Processes" (ICLR 2018). The following is equation (7), which can be found on page 5 of the paper:
$$ \require{cancel} P(z^{*}\mid D, x^{*}) = \int P(z^{*}\mid z,x,x^{*})P(z\mid D)dz = \frac{1}{P(t)}\int P(z^{*},z\mid x,x^{*})P(t\mid z)dz$$
When I try to derive it myself, I obtain the following:
$$ \begin{split} P(z^{*}\mid D, x^{*}) &= \int P(z^{*}, z\mid D,x^{*})dz\\ &= \int P(z^{*} \mid z,D,x^{*})P(z\mid D,x^{*})dz \\ &= \int P(z^{*} \mid z,D,x^{*})P(z\mid D)dz \hspace{1in} \text{since $z$ is independent of $x^{*}$}\\ &= \int P(z^{*} \mid z,x,t,x^{*})P(z\mid x,t)dz \hspace{1in} \text{since $D=(x,t)$}\\ &= \int P(z^{*} \mid z,t,x,x^{*})P(z\mid t,x)dz \hspace{1in} \text{just rearranging variables}\\ &= \int \frac{P(z^{*},z,t\mid x,x^{*})}{\cancel{P(z,t\mid x,x^{*})}} \frac{\cancel{P(z,t\mid x)}}{P(t\mid x)} dz \hspace{1in} \text{both $z,t$ are independent of $x^{*}$} \\ &= \int \frac{P(z^{*},z,t\mid x,x^{*})}{P(t\mid x)} dz \\ &= \int \frac{P(z^{*},z\mid x,x^{*})P(t\mid z^{*},z,x,x^{*})}{P(t\mid x)} dz \hspace{1in} \text{expanding the numerator}\\ &= \int \frac{P(z^{*},z\mid x,x^{*})P(t\mid z)}{P(t\mid x)} dz \hspace{1in} \text{$t$ is independent of the others given $z$} \end{split}$$
This result differs from what the authors obtained in the paper: I have $P(t\mid x)$ in the denominator instead of $P(t)$.
Can anyone point out where my derivation went wrong?
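As a sanity check, I verified the identity numerically on a small discrete toy model (all distribution tables below are hypothetical, and $x, x^{*}$ are held fixed and therefore left implicit). In this setting my expression with $P(t\mid x)$ in the denominator and the paper's form agree, since the normalizer is the same quantity either way:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete model: z and z* each take 3 values, t takes 2 values.
# The inputs x, x* are fixed throughout, so every table below is
# implicitly conditioned on them.
n_z, n_zs, n_t = 3, 3, 2

# P(z*, z | x, x*): a random joint table, normalized to sum to 1.
joint = rng.random((n_zs, n_z))
joint /= joint.sum()

# P(t | z): random conditional; each column (fixed z) sums to 1.
lik = rng.random((n_t, n_z))
lik /= lik.sum(axis=0, keepdims=True)

t_obs = 0  # the observed target in D = (x, t)

# Marginal P(z | x) and conditional P(z* | z, x, x*).
p_z = joint.sum(axis=0)
p_zs_given_z = joint / p_z  # divide each column by P(z)

# Posterior P(z | D) = P(t | z) P(z | x) / P(t | x).
p_t_given_x = (lik[t_obs] * p_z).sum()
posterior = lik[t_obs] * p_z / p_t_given_x

# Left-hand side of (7): sum_z P(z* | z, x, x*) P(z | D).
lhs = p_zs_given_z @ posterior

# Right-hand side: (1 / normalizer) * sum_z P(z*, z | x, x*) P(t | z).
rhs = (joint * lik[t_obs]).sum(axis=1) / p_t_given_x

print(np.allclose(lhs, rhs))  # the two expressions coincide
print(lhs.sum())              # and form a proper distribution over z*
```

This only confirms that the two forms compute the same posterior; it does not settle whether the paper's $P(t)$ is meant as shorthand for $P(t\mid x)$, which is what I am unsure about.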