The mean of $\mu_{P}(\theta)=\frac{1}{Z}P(x|\theta)$


Consider a parametrized probability measure $P(x|\theta)$, that is, for each $\theta\in[a,b]$ it is a valid probability measure in $x$. Denote by $f(\theta)$ its mean and by $\Sigma(\theta)$ its variance. Now, for a given $x$ (which we hold fixed throughout the discussion), it can be used to define a probability measure on $\theta$ in the natural way: $\mu_{P}(\theta)=\frac{1}{\int d\theta'\, P(x|\theta')}P(x|\theta)$.

I have two questions:

  • Is anyone aware of a systematic study of this construction? (This situation indeed stems from Bayesian inference, but I did not find any such discussion in that context.)
  • Can you say anything about the relation between the mean $\mathbb{E}[\mu_{P}]$ and $\mathbb{E}[\mu_{N(f(\theta),\Sigma(\theta))}]$, the mean of the measure on $\theta$ constructed in the same way from a normal distribution with the same mean and variance?
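For concreteness, here is a small numerical sketch of the comparison in the second question, using a hypothetical choice of family: $P(x|\theta)$ a Poisson distribution with mean $\theta$, so that $f(\theta)=\theta$ and $\Sigma(\theta)=\theta$. The specific values of $x$, $[a,b]$, and the grid resolution are all illustrative assumptions, not part of the question.

```python
import numpy as np
from scipy import stats

# Illustrative choices (assumptions, not from the question):
a, b = 0.1, 30.0              # parameter range [a, b]
x = 5                         # observation, held fixed
theta = np.linspace(a, b, 20001)
dtheta = theta[1] - theta[0]

def mean_of_normalized(density):
    """Normalize a density on the theta grid and return the mean
    of the resulting probability measure (simple Riemann sum)."""
    Z = density.sum() * dtheta
    return (theta * density).sum() * dtheta / Z

# mu_P: measure on theta induced by the Poisson likelihood P(x|theta)
mean_P = mean_of_normalized(stats.poisson.pmf(x, theta))

# mu_N: same construction from N(f(theta), Sigma(theta)) = N(theta, theta)
mean_N = mean_of_normalized(stats.norm.pdf(x, loc=theta, scale=np.sqrt(theta)))

print(mean_P, mean_N)
```

In this particular example the two means happen to come out numerically close, but of course a single family proves nothing in general; the sketch is only meant to make the two quantities in the question concrete.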

Thanks!