Bayesian estimation with a mixture of deterministic and stochastic parameters?


I am stuck on how to properly handle a situation where some of the parameters to be estimated have a known distribution and some do not, without artificially inventing a distribution for the deterministic parameters.

Is it possible to set up a minimum mean square error (MMSE) or maximum a posteriori (MAP) estimator with both deterministic and stochastic parameters? The stochastic parameters have a known distribution, while the deterministic ones must be estimated with no prior knowledge. For example, $$\boldsymbol\theta=\left[\begin{matrix}a & b\end{matrix}\right]^T $$ $$\mathbf{x}=\left[\begin{matrix}a+b+\nu_1 \\ ab+\nu_2\end{matrix}\right]$$ where $\boldsymbol\theta$ is the vector of parameters, $\mathbf{x}$ is the vector of measurements, $\nu_1\sim\mathcal{N}\left(0,\sigma_\nu^2\right)$, $\nu_2\sim\mathcal{N}\left(0,\sigma_\nu^2\right)$, and $b\sim\mathcal{N}\left(0,\sigma_b^2\right)$, but no distribution is available for $a$.

I put together a MAP estimator for a different system with a mix of stochastic and deterministic parameters by leaving out the probability information for the deterministic ones: $$\hat{\boldsymbol\theta}=\arg \max_{\boldsymbol\theta} \ln\left[p(\mathbf{x}|\boldsymbol\theta)p(\theta_s)\right]$$ where $\theta_s$ was the one stochastic parameter. The estimator seemed to work well, but this feels like an iffy approach and I can't really justify it to myself now.
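To make this concrete for the example model above, here is a small numerical sketch of that hybrid objective: maximize the log-likelihood plus the log-prior of $b$ only, with no prior term for $a$. The noise levels, true values, and the use of SciPy's generic optimizer are all made up for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
sigma_nu, sigma_b = 0.1, 1.0
a_true = 2.0                          # deterministic, no known distribution
b_true = rng.normal(0.0, sigma_b)     # stochastic, b ~ N(0, sigma_b^2)

# Measurements: x = [a + b + nu1, a*b + nu2]
x = np.array([a_true + b_true, a_true * b_true]) + rng.normal(0.0, sigma_nu, 2)

def neg_log_hybrid(theta):
    a, b = theta
    # Gaussian log-likelihood of both measurements
    resid = x - np.array([a + b, a * b])
    log_lik = -np.sum(resid**2) / (2 * sigma_nu**2)
    # Prior term for the stochastic parameter b only; a gets no prior
    log_prior_b = -b**2 / (2 * sigma_b**2)
    return -(log_lik + log_prior_b)

res = minimize(neg_log_hybrid, x0=[1.0, 0.0])
a_hat, b_hat = res.x
```

Note that the likelihood alone is symmetric in $a$ and $b$ (both $a+b$ and $ab$ are symmetric), so without the prior term on $b$ the problem has two indistinguishable solutions; the prior is the only thing breaking that tie, which is part of why the construction feels hard to justify.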

MMSE is even more of a problem, since it needs $$p(\boldsymbol\theta\vert\mathbf{x})=\frac{p(\mathbf{x}\vert\boldsymbol\theta)p(\boldsymbol\theta)}{\int p(\mathbf{x}\vert\boldsymbol\theta)p(\boldsymbol\theta) d\boldsymbol\theta}$$ and it seems really questionable to just replace $p(\mathbf{x}\vert\boldsymbol\theta)p(\boldsymbol\theta)$ with $p(\mathbf{x}\vert\boldsymbol\theta)p(\theta_s)$ and only integrate over $\theta_s$.
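For what it's worth, here is what that replacement amounts to numerically for the example model: $a$ is held at some fixed point value, a "posterior" over $b$ alone is formed as $p(b\vert\mathbf{x},a)\propto p(\mathbf{x}\vert a,b)p(b)$ with the normalizer integrating over $b$ only, and its mean is taken as the MMSE estimate of $b$. The measurement values, noise levels, and the grid quadrature are made up; this just illustrates the construction I find hard to justify, not a recommendation.

```python
import numpy as np

sigma_nu, sigma_b = 0.1, 1.0
x = np.array([2.11, 0.32])   # hypothetical measurement values
a_fixed = 1.95               # a treated as a fixed, separately estimated value

# Unnormalized log "posterior" over b alone: log p(x | a, b) + log p(b),
# evaluated on a grid; the normalization integrates over b only
b = np.linspace(-5.0, 5.0, 4001)
log_w = (-((x[0] - a_fixed - b)**2 + (x[1] - a_fixed * b)**2) / (2 * sigma_nu**2)
         - b**2 / (2 * sigma_b**2))
w = np.exp(log_w - log_w.max())   # subtract max for numerical stability
w /= w.sum()                      # normalize over the b grid

# MMSE estimate of b, conditional on the fixed a
b_mmse = np.sum(b * w)
```

Since the model is linear in $b$ for fixed $a$, this conditional posterior is Gaussian, so the grid mean coincides with the conditional MAP; the questionable part is that $a$ never gets averaged over at all.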