Unbiased estimator of multivariate normal distribution


I am reading a paper that tries to obtain an unbiased estimate $\hat{\theta}$ of the parameters $\theta$ of the covariance matrix $\mathbf{R}(\theta)$ of a multivariate Gaussian random vector $\mathbf{n}\sim\mathcal{N}(\mathbf{0},\mathbf{R}(\theta)+\alpha_0\mathbf{I})$. It assumes that $N$ observations of $\mathbf{n}$ are collected in a matrix $\mathbf{X}$ of size $N\times M$.

At some point, the authors say that, assuming an unbiased estimator of the covariance matrix, we have $$\mathbb{E}[\mathbf{R}(\theta)-\mathbf{R}(\hat{\theta})]=\mathbf{0}$$ and $$\mathbb{E}[\mathbf{R}^{-1}(\hat{\theta})\mathbf{R}(\theta)]=\mathbf{I}.$$ I cannot understand how these equalities follow from the unbiasedness assumption.
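As a quick numerical sanity check (my own sketch, not from the paper): take the sample covariance with known zero mean as the unbiased estimate standing in for $\mathbf{R}(\hat\theta)$, with an arbitrary matrix `Sigma` playing the role of $\mathbf{R}(\theta)+\alpha_0\mathbf{I}$. The first expectation is indeed zero, but the second is a multiple of the identity ($N/(N-M-1)$ on the diagonal, a known inverse-Wishart moment), not the identity itself:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, trials = 3, 10, 20000

# Arbitrary true covariance; plays the role of R(theta) + alpha0*I.
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])

# `trials` independent data sets, each with N zero-mean M-dim observations.
X = rng.multivariate_normal(np.zeros(M), Sigma, size=(trials, N))

# Sample covariance with known zero mean: E[S] = Sigma, i.e. S is unbiased.
S = X.transpose(0, 2, 1) @ X / N

bias = (Sigma - S).mean(axis=0)                 # ~ zero matrix, as claimed
prod = (np.linalg.inv(S) @ Sigma).mean(axis=0)  # diagonal ~ N/(N-M-1) = 5/3, not 1

print(np.round(bias, 2))
print(np.round(prod, 2))
```

So at least for this estimator, the first identity holds but the second does not hold exactly; it may be intended only as an approximation for large $N$.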


It seems that this claim does not hold in general. Here is a counterexample.

$\theta$: unknown constant

$\hat\theta$: unbiased estimator of $\theta$

Further, let $\theta=0$. Then, by unbiasedness, $\mathbb{E}\hat\theta = 0$.

Consider the function $f(x) = x^2$. Then $f(\theta) = 0^2 = 0$, but $\mathbb{E}[f(\hat\theta)] = \mathbb{E}[\hat\theta^2] = \operatorname{Var}(\hat\theta) \neq 0$ whenever $\hat\theta$ is not deterministic.

This counterexample shows that even if $\mathbb{E}\hat\theta = \theta$ holds, $\mathbb{E}[f(\hat\theta)] \neq f(\theta)$ in general: expectation does not commute with nonlinear functions of the estimator (cf. Jensen's inequality for convex $f$).
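The counterexample is easy to verify numerically (a sketch with arbitrary choices: $\hat\theta$ is the mean of 5 i.i.d. standard normal draws, which is unbiased for $\theta = 0$):

```python
import numpy as np

rng = np.random.default_rng(1)

# theta = 0; hat_theta = mean of n_obs iid N(0,1) draws -> unbiased estimator.
n_obs, trials = 5, 100_000
theta_hat = rng.standard_normal((trials, n_obs)).mean(axis=1)

print(theta_hat.mean())       # ~ 0   = theta            (unbiasedness)
print((theta_hat**2).mean())  # ~ 0.2 = Var(theta_hat), not f(theta) = 0
```

The second moment concentrates near $\operatorname{Var}(\hat\theta) = 1/5$, confirming that $\mathbb{E}[f(\hat\theta)] \neq f(\theta)$ here.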