How to prove multivariate Bayesian Cramér-Rao inequality?


I recently came across the following multivariate Bayesian Cramér-Rao inequality, but I don't know how to prove it.

Let $\{f(\cdot ; \theta): \theta \in \Theta\}$ be a family of probability density functions on a sample space $\mathcal{X}$, where the parameter space $\Theta$ is an open set in $\mathbb{R}^{d}$.

One of the fundamental results in mathematical statistics is that (under weak regularity conditions) any unbiased estimator $\hat{\theta}$ of $\theta$ satisfies the multivariate Cramér-Rao inequality \begin{equation} E_{\theta}\left[(\hat{\theta}(x)-\theta)(\hat{\theta}(x)-\theta)^{T}\right] \geq I(\theta)^{-1}, \end{equation} i.e. $E_{\theta}\left[(\hat{\theta}(x)-\theta)(\hat{\theta}(x)-\theta)^{T}\right]-I(\theta)^{-1}$ is positive semi-definite.
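As a numerical sanity check of the multivariate bound (a sketch with assumed values: a bivariate Gaussian location model $N(\theta, \Sigma)$ with known $\Sigma$, where the sample mean is unbiased and attains the bound with equality, so its covariance should match $I(\theta)^{-1} = \Sigma/n$):

```python
import numpy as np

# Bivariate Gaussian location model N(theta, Sigma) with known Sigma.
# For n i.i.d. samples, I(theta) = n * Sigma^{-1}, and the sample mean
# is unbiased with covariance Sigma / n = I(theta)^{-1} (equality case).
rng = np.random.default_rng(0)
theta = np.array([1.0, 2.0])            # true parameter (assumed values)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])          # known noise covariance
n, m = 50, 20000                        # samples per experiment, experiments

# Draw m experiments of n samples each; estimate theta by the sample mean.
samples = rng.multivariate_normal(theta, Sigma, size=(m, n))
theta_hat = samples.mean(axis=1)        # shape (m, 2)

emp_cov = np.cov(theta_hat, rowvar=False)   # empirical covariance of estimator
crlb = Sigma / n                            # I(theta)^{-1}

print("empirical covariance:\n", emp_cov)
print("Cramer-Rao bound I(theta)^{-1}:\n", crlb)
# emp_cov - crlb should be ~0 here (bound attained); its smallest
# eigenvalue can dip slightly below zero only from Monte Carlo noise.
print("min eigenvalue of difference:", np.linalg.eigvalsh(emp_cov - crlb).min())
```

For an estimator that does not attain the bound, the same check would show `emp_cov - crlb` with nonnegative eigenvalues up to sampling error.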

  1. Here $E_{\theta}$ denotes expectation with respect to $f(\cdot ; \theta)$.
  2. $I(\theta)$ denotes the Fisher information matrix at $\theta$.
  3. The matrix $E_{\theta}\left[(\hat{\theta}(x)-\theta)(\hat{\theta}(x)-\theta)^{T}\right]$ is the variance matrix of $\hat{\theta}(x)$.
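For completeness, the Fisher information matrix appearing in the bound is defined (under the usual regularity conditions permitting differentiation under the integral sign) by

$$
I(\theta)_{jk}
  = E_{\theta}\!\left[
      \frac{\partial \log f(x;\theta)}{\partial \theta_j}\,
      \frac{\partial \log f(x;\theta)}{\partial \theta_k}
    \right]
  = -E_{\theta}\!\left[
      \frac{\partial^2 \log f(x;\theta)}{\partial \theta_j\,\partial \theta_k}
    \right],
  \qquad j,k = 1,\dots,d,
$$

where the second equality also requires the standard regularity conditions.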

There are various Bayesian versions of the Cramér-Rao inequality. For the simplest of these, let $\pi$ be a proper prior distribution on $\Theta$.

Then taking the expectation of the multivariate Cramér-Rao inequality over $\pi$ and using convexity of the map $X \mapsto X^{-1}$ on the set of positive-definite $d \times d$ matrices yields the Bayesian Cramér-Rao inequality $$ E_{\pi}\left[E_{\theta}\left[(\hat{\theta}(x)-\theta)(\hat{\theta}(x)-\theta)^{T}\right]\right] \geq E_{\pi}[I(\theta)]^{-1}, $$ where $E_{\pi}$ denotes expectation with respect to $\pi$.
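Spelling out the two steps in that argument, the chain of inequalities (in the positive-semidefinite order) is

$$
E_{\pi}\!\left[E_{\theta}\!\left[(\hat{\theta}(x)-\theta)(\hat{\theta}(x)-\theta)^{T}\right]\right]
\;\geq\; E_{\pi}\!\left[I(\theta)^{-1}\right]
\;\geq\; E_{\pi}\!\left[I(\theta)\right]^{-1},
$$

where the first inequality integrates the pointwise Cramér-Rao bound against $\pi$ (expectation preserves the semidefinite order), and the second is Jensen's inequality for the operator-convex function $X \mapsto X^{-1}$ on positive-definite matrices.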