Measure theoretic basis of joint distrib of parameters and data in Bayesian analysis


In Bayesian statistics you have a prior density for your parameters $\Theta$, namely $\pi(\theta)$ for $\theta\in\mathcal{T}\subset\mathbb{R}^k$, and you have the conditional density of the data given the parameters, $L(y\mid\Theta=\theta)$, for $y\in\mathcal{Y}\subset\mathbb{R}^n$. The joint density of the parameters and the data is therefore $\pi(\theta)\,L(y\mid\Theta=\theta)$, from which the posterior $\pi(\theta\mid Y=y)$ can be calculated.
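To make the setup concrete, here is a minimal numerical sketch of exactly this prior-times-likelihood computation. The model (a uniform prior on $(0,1)$ and a binomial likelihood) is my own illustrative assumption, not part of the question:

```python
import math

# Assumed toy model: theta in (0,1) with uniform prior pi(theta) = 1,
# data y = number of successes in n Bernoulli(theta) trials (hypothetical numbers).
n, y = 10, 7

def likelihood(y, theta, n):
    # L(y | Theta = theta): binomial probability mass function
    return math.comb(n, y) * theta**y * (1 - theta)**(n - y)

# Discretize the parameter space T = (0,1) with midpoints of 1000 cells.
h = 1 / 1000
grid = [(i + 0.5) * h for i in range(1000)]
prior = [1.0 for _ in grid]                                       # pi(theta)
joint = [p * likelihood(y, t, n) for p, t in zip(prior, grid)]    # pi(theta) * L(y|theta)

# Posterior pi(theta | Y = y): normalize the joint over theta.
norm = sum(joint) * h          # approximates the marginal density of y at the data
posterior = [j / norm for j in joint]

# With a uniform prior the posterior is Beta(y+1, n-y+1), whose mean is (y+1)/(n+2).
post_mean = sum(t * p for t, p in zip(grid, posterior)) * h
print(round(post_mean, 3))     # ≈ (7+1)/(10+2) ≈ 0.667
```

The normalization step is where the question below becomes relevant: it implicitly integrates a joint measure over $\theta$, which presupposes that the joint measure is well defined in the first place.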

So far so good: all of this works if the random vectors $\Theta$ and $Y$ are defined on the same probability space $(\Omega,\mathcal{F},P)$, mapping into subsets of $\mathbb{R}^k$ and $\mathbb{R}^n$ respectively, so that the conditional, marginal and joint measures are formally defined, and densities with respect to a $\sigma$-finite measure can be defined.
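For concreteness, the construction I have in mind is the canonical one, where the common space is the product of parameter and sample spaces:

$$\Omega = \mathcal{T}\times\mathcal{Y}, \qquad \mathcal{F} = \mathcal{B}(\mathcal{T})\otimes\mathcal{B}(\mathcal{Y}), \qquad P(A) = \int_A \pi(\theta)\, L(y\mid\Theta=\theta)\,\mu(d\theta)\,\nu(dy),$$

with $\Theta(\theta,y)=\theta$ and $Y(\theta,y)=y$ the coordinate projections, and $\mu$, $\nu$ the $\sigma$-finite dominating measures on $\mathcal{T}$ and $\mathcal{Y}$.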

The question is: why should both $\Theta$ and $Y$ be defined on the same space?

Is there another formal way to do it?

Best regards,

Juan