Rao-Cramer lower bound for Rician distribution


I derived the ML estimator for Rician-distributed data, and I am now trying to derive the Cramér-Rao lower bound for the variance of the estimator $\hat{A}$.

$$f(x_k|A,\sigma) = \frac{x_k}{\sigma^2}\exp\left(-\frac{x^2_k+A^2}{2\sigma^2}\right)I_0\left(\frac{x_k A}{\sigma^2}\right) \tag{Rician distribution}$$

where $I_n(x) = \left( {1 \over 2}x \right )^{n} \sum_{k=0}^{\infty} \frac{\left( {1 \over 4}x^2 \right )^k}{k! \,(n+k)!}$ is the modified Bessel function of the first kind of order $n$ ($n \in \mathbb{N}$).
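As a quick numerical sanity check of this series, here is a short Python sketch (my own addition; the truncation length `terms=40` is an arbitrary choice):

```python
import math

def bessel_i(n, x, terms=40):
    """Truncated series for I_n(x) = (x/2)^n * sum_k (x^2/4)^k / (k! (n+k)!)."""
    term = (0.5 * x) ** n / math.factorial(n)   # k = 0 term
    total = term
    for k in range(1, terms):
        term *= 0.25 * x * x / (k * (n + k))    # ratio of consecutive series terms
        total += term
    return total

# known identities: I_0(0) = 1 and I_0'(x) = I_1(x)
assert abs(bessel_i(0, 0.0) - 1.0) < 1e-12
h = 1e-6
deriv = (bessel_i(0, 2.0 + h) - bessel_i(0, 2.0 - h)) / (2 * h)
assert abs(deriv - bessel_i(1, 2.0)) < 1e-6
```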

The ML estimator I derived (an implicit equation, since $A$ also appears on the right-hand side):

$$\hat{A} = \frac{1}{N} \sum_{k=1}^{N} x_k  \frac{  I_1 \left( \frac{x_k A}{\sigma^2} \right )}{   I_0 \left( \frac{x_k A}{\sigma^2} \right )}$$
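Since the estimate appears on both sides, the equation can be solved by fixed-point iteration. A minimal Python sketch (the series Bessel implementation, the function names, and the data-generation step are my own additions, not from the derivation above):

```python
import math
import random

def bessel_i(n, x, terms=60):
    # truncated series for the modified Bessel function I_n(x)
    term = (0.5 * x) ** n / math.factorial(n)
    total = term
    for k in range(1, terms):
        term *= 0.25 * x * x / (k * (n + k))
        total += term
    return total

def rician_mle(xs, sigma, a0=1.0, tol=1e-9, max_iter=200):
    """Fixed-point iteration for
       A <- (1/N) * sum_k x_k * I1(x_k A / s^2) / I0(x_k A / s^2)."""
    s2 = sigma * sigma
    a = a0
    for _ in range(max_iter):
        a_new = sum(x * bessel_i(1, x * a / s2) / bessel_i(0, x * a / s2)
                    for x in xs) / len(xs)
        if abs(a_new - a) < tol:
            return a_new
        a = a_new
    return a

# synthetic Rician data: envelope of a complex Gaussian with mean A_true
random.seed(0)
A_true, sigma, N = 2.0, 1.0, 2000
xs = [math.hypot(A_true + random.gauss(0, sigma), random.gauss(0, sigma))
      for _ in range(N)]
a_hat = rician_mle(xs, sigma)
print(a_hat)  # should land close to A_true
```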

Unfortunately, deriving the variance looks like a challenging problem. Could anyone suggest an approach?

$$\operatorname{Var}(\hat{A}) = \frac{1}{N^2}E_{\hat{A}} \left( \sum_{k,l=1}^N\left( x_k-E_{\hat{A}}\left(x_k\frac{I_1}{I_0}\right) \right) \left( x_l-E_{\hat{A}}\left(x_l\frac{I_1}{I_0}\right) \right) \right)$$

where $I_0, I_1$ denote $I_0 \left( \frac{x_k A }{\sigma^2} \right )$ and $I_1 \left( \frac{x_k A }{\sigma^2} \right )$, respectively.

---

I hope I have understood what you are trying to do. If you are trying to show that $\hat{A}$ achieves the Cramér-Rao lower bound, then:

1) According to Mathematica, there seems to be no closed form for the Fisher information, so even if the estimator did achieve the bound, its variance would have no closed form. Attempting an expression for the variance is not the way forward.
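Even without a closed form, the Fisher information can be evaluated numerically. A rough Python sketch (the quadrature grid, finite-difference step, and the example values $A=2$, $\sigma=1$, $N=100$ are arbitrary choices of mine):

```python
import math

def log_i0(z, terms=80):
    # log of I_0(z) via its power series, with a running term
    term, total = 1.0, 1.0
    for k in range(1, terms):
        term *= 0.25 * z * z / (k * k)
        total += term
    return math.log(total)

def log_f(x, a, sigma=1.0):
    # Rician log-density log f(x | A, sigma)
    s2 = sigma * sigma
    return math.log(x / s2) - (x * x + a * a) / (2 * s2) + log_i0(x * a / s2)

def fisher_info(a, sigma=1.0, h=1e-5, n=4000):
    """I(A) = E[(d/dA log f)^2], by finite differences in A and the
    trapezoid rule in x (a numerical check, not a closed form)."""
    lo, hi = 1e-9, a + 10 * sigma
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * dx
        w = 0.5 if i in (0, n) else 1.0
        score = (log_f(x, a + h, sigma) - log_f(x, a - h, sigma)) / (2 * h)
        total += w * math.exp(log_f(x, a, sigma)) * score ** 2 * dx
    return total

i_a = fisher_info(2.0)
print(i_a, "CRLB for N=100:", 1.0 / (100 * i_a))
```

Note that for the amplitude parameter, $I(A)$ stays below $1/\sigma^2$ and approaches it at high SNR, which is a useful sanity check on the quadrature.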

However

2) From part of a theorem of Amari [1] (I assume others state it too): if any unbiased estimator is to achieve this bound at all, it is necessary (but not sufficient) that

$$\mathbb{E}\left[\frac{\partial^2}{\partial A^2}\log f(x;A)\frac{\partial}{\partial A}\log f(x;A) + \left(\frac{\partial}{\partial A}\log f(x;A) \right)^3\right]= \Gamma^{(m)}(A) = 0$$

which, according to Mathematica, is not zero. Here is a plot of $\Gamma^{(m)}(A)$ for $\sigma = 1$:

[Plot of $\Gamma^{(m)}(A)$ versus $A$]
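If you want to reproduce this check without Mathematica, $\Gamma^{(m)}(A)$ can be approximated numerically: take finite differences of $\log f$ in $A$ and integrate against $f$ with the trapezoid rule. A Python sketch of this (all numerical parameters are arbitrary choices of mine; the normalization and zero-mean-score values are sanity checks on the quadrature):

```python
import math

def log_i0(z, terms=80):
    # log of I_0(z) via its power series, with a running term
    term, total = 1.0, 1.0
    for k in range(1, terms):
        term *= 0.25 * z * z / (k * k)
        total += term
    return math.log(total)

def log_f(x, a, sigma=1.0):
    # Rician log-density log f(x | A, sigma)
    s2 = sigma * sigma
    return math.log(x / s2) - (x * x + a * a) / (2 * s2) + log_i0(x * a / s2)

def gamma_m(a, sigma=1.0, h=1e-4, n=4000):
    """Approximate Gamma^(m)(A) = E[l'' l' + (l')^3] with l = log f(x; A),
    plus the quadrature's normalization and mean-score as sanity checks."""
    lo, hi = 1e-9, a + 10 * sigma
    dx = (hi - lo) / n
    total = norm = score = 0.0
    for i in range(n + 1):
        x = lo + i * dx
        w = 0.5 if i in (0, n) else 1.0     # trapezoid endpoint weights
        lp = log_f(x, a + h, sigma)
        lm = log_f(x, a - h, sigma)
        l0 = log_f(x, a, sigma)
        d1 = (lp - lm) / (2 * h)            # l'
        d2 = (lp - 2 * l0 + lm) / (h * h)   # l''
        f = math.exp(l0)
        total += w * f * (d2 * d1 + d1 ** 3) * dx
        norm += w * f * dx                  # should be ~1
        score += w * f * d1 * dx            # should be ~0
    return total, norm, score

g, norm, score = gamma_m(1.0)
print(g)
```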

Another condition (which, together with the previous one, is sufficient) for the bound to be attained with equality is that the distribution belong to an exponential family, i.e. that it be expressible in the form (for some parameters $\xi_i$)

$$ f(x;\xi)=\exp\left(C(x) + \sum_i \xi_i F_i(x) - \psi(\xi)\right)$$

which is impossible, as it would require expressing the Bessel factor $I_0\left(\frac{xA}{\sigma^2}\right)$ in the product form $\exp (C(x) - \log x) \exp (\phi(A))$, which cannot be done.

So you won't be able to show that $\hat{A}$ reaches the lower bound, because no unbiased estimator of $A$ does. If the estimator is biased (more likely than not), then you should be talking about mean squared error, not variance.

[1] Amari, *Methods of Information Geometry*, Section 3.5.