I have been trying to understand the posterior predictive distribution. The following expression is given here:
$p(x^*|x)=\int_\Theta c\times p(x^*,\theta|x)d\theta=\int_\Theta c\times p(x^*|\theta)p(\theta|x)d\theta$
I understand that the conditional probability is defined as
$f(A\mid B)=\frac{f(A,B)}{f(B)} = \frac{f(B\mid A)\,f(A)}{f(B)}$
Can someone explain how the posterior predictive distribution is derived using conditional probability?
The first equality is the law of total probability. It is a special case of $f_X(x) = \int f_{X,Y}(x,y) \, dy$.
The second equality comes from computations with conditional densities similar to what you have written: $p(x^*, \theta \mid x) = p(x^* \mid \theta, x) p(\theta \mid x) = p(x^* \mid \theta) p(\theta \mid x)$, where the last step is due to $p(x^* \mid \theta, x) = p(x^* \mid \theta)$ by conditional independence of $x^*$ and $x$ given $\theta$.
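To make the two equalities concrete, here is a minimal numerical sketch in a conjugate Beta-Bernoulli model (the data, prior parameters, and grid size are my own illustrative choices, not from the question). With a $\mathrm{Beta}(a, b)$ prior and $k$ successes in $n$ Bernoulli trials, the posterior is $\mathrm{Beta}(a+k,\, b+n-k)$, and the posterior predictive $P(x^*=1 \mid x) = \int_0^1 \theta \, p(\theta \mid x) \, d\theta$ has the closed form $(a+k)/(a+b+n)$. The code approximates the integral directly and checks it against the closed form:

```python
import math

# Hypothetical data: n Bernoulli trials with k successes, Beta(a, b) prior.
a, b = 2.0, 2.0
n, k = 10, 7

# Posterior over theta is Beta(a + k, b + n - k) by conjugacy.
a_post, b_post = a + k, b + n - k

def beta_pdf(theta, alpha, beta):
    """Density of Beta(alpha, beta) at theta."""
    norm = math.gamma(alpha + beta) / (math.gamma(alpha) * math.gamma(beta))
    return norm * theta ** (alpha - 1) * (1 - theta) ** (beta - 1)

# Posterior predictive: P(x* = 1 | x) = ∫ p(x* = 1 | theta) p(theta | x) dtheta,
# where p(x* = 1 | theta) = theta. Approximate with a midpoint Riemann sum.
m = 100_000
predictive = sum(
    ((i + 0.5) / m) * beta_pdf((i + 0.5) / m, a_post, b_post)
    for i in range(m)
) / m

# Closed form: the posterior mean (a + k) / (a + b + n).
closed_form = (a + k) / (a + b + n)
print(predictive, closed_form)
```

The integral and the closed form agree to within the discretization error, illustrating that the predictive density is just the likelihood of the new observation averaged over the posterior.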