Suppose we have a random variable $Y$ with pdf $P(Y|a,b)$, where $a$ and $b$ are parameters, each ranging over $(0, \infty)$. Suppose also that we have marginal posterior distributions $P(a\vert x)$ and $P(b\vert x)$ for $a$ and $b$, where $x$ is the observed data.
Would the predictive distribution of $Y$ then be
$$P(Y\vert x)=\int_0^\infty \left[\int_0^\infty P(Y|a,b)\cdot P(a|x) \,da\right] \cdot P(b\vert x) \,db$$
My confusion arises because we have two marginal distributions rather than a joint pdf.
Thanks in advance; any help would be greatly appreciated.
The answer to your question is, in general, "no". The correct expression would be $$ P(Y\vert x)=\int_0^\infty \int_0^\infty P(Y|a,b)\cdot P(a,b|x)\,da \,db, $$ where $P(a,b|x)$ is the joint posterior density of $a$ and $b$ given $x$. If, however, $a$ and $b$ are conditionally independent given $x$, then their joint density is equal to the product of their marginal densities, $P(a,b|x)=P(a|x)P(b|x)$, and the expression you give is correct. If $a$ and $b$ are not conditionally independent, you must have an explicit relationship between them, or some other way of obtaining the joint density, to make this calculation in the general case.
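To see concretely why the two expressions can disagree, here is a minimal numerical sketch. It uses a hypothetical toy posterior in which $(a,b)$ takes the value $(1,1)$ or $(10,10)$ with probability $1/2$ each (so $a$ and $b$ are perfectly correlated), and a hypothetical sampling model $Y\mid a,b \sim N(a, b^2)$ — both are illustrative choices, not anything from your model:

```python
import math

def normal_pdf(y, mean, sd):
    """Density of N(mean, sd^2) evaluated at y."""
    z = (y - mean) / sd
    return math.exp(-0.5 * z * z) / (sd * math.sqrt(2.0 * math.pi))

# Hypothetical joint posterior: (a, b) is (1, 1) or (10, 10),
# each with probability 1/2, so a and b are perfectly correlated.
joint = {(1.0, 1.0): 0.5, (10.0, 10.0): 0.5}

# The marginals implied by that joint: a is 1 or 10 w.p. 1/2; same for b.
marg_a = {1.0: 0.5, 10.0: 0.5}
marg_b = {1.0: 0.5, 10.0: 0.5}

y = 0.0

# Correct predictive density: integrate P(y | a, b) against the JOINT posterior.
p_correct = sum(w * normal_pdf(y, a, b) for (a, b), w in joint.items())

# Product-of-marginals version: weights every (a, b) pair as if independent.
p_product = sum(
    wa * wb * normal_pdf(y, a, b)
    for a, wa in marg_a.items()
    for b, wb in marg_b.items()
)

print(p_correct)  # ~0.133
print(p_product)  # ~0.076 -- a different answer
```

The product-of-marginals version spreads mass onto pairs like $(a,b)=(10,1)$ that the joint posterior says never occur, which is exactly why the factorized formula fails when the parameters are dependent.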