Finding the Bayesian estimator associated with the entropy distance / Kullback–Leibler divergence


I think I'm using more or less standard notation; please let me know if any of it is unclear.

I'm trying to find the Bayesian estimator for the loss function:

$ \begin{equation} L_e(\theta, d) = \mathbb{E}_\theta \log\frac{f(x|\theta)}{f(x|d)} = \int f(x|\theta) \log\frac{f(x|\theta)}{f(x|d)} \, dx \end{equation} $
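As a concrete instance (the normal model here is my own illustration, not part of the question): for a $N(\theta, \sigma^2)$ location model with known $\sigma^2$, this loss is the Kullback–Leibler divergence between the two sampling densities and reduces to a scaled squared error:

$$ L_e(\theta, d) = \int f(x \mid \theta)\,\log\frac{f(x \mid \theta)}{f(x \mid d)}\,dx = \mathrm{KL}\!\left(N(\theta,\sigma^2)\,\big\|\,N(d,\sigma^2)\right) = \frac{(\theta - d)^2}{2\sigma^2}. $$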

The Bayesian estimator $\delta^\pi$ is defined as the estimator that minimizes the integrated risk, that is $ \begin{equation} \delta^\pi (x) := \operatorname{argmin}_\delta \, r(\pi,\delta), \qquad r(\pi,\delta) = \mathbb{E}^\pi R(\theta, \delta) \end{equation} $

where $R(\theta, \delta)$ denotes the frequentist risk, that is $\mathbb{E}_\theta L_e(\theta, \delta)$, and $\mathbb{E}^\pi$ is the expectation taken w.r.t. the posterior distribution.

That is, I need to minimize the following w.r.t. $d$:

$\int_\Theta \int f(x|\theta) \log\frac{f(x|\theta)}{f(x|d)} \, dx \; \pi(\theta | x) \, d\theta $

which, evaluating the loss at the observed $x$, reduces to minimizing

$\int_\Theta \log\frac{f(x|\theta)}{f(x|d)} \, \pi(\theta | x) \, d\theta = \int_\Theta \big[\log f(x|\theta) - \log f(x|d)\big] \pi(\theta | x) \, d\theta $

Intuitively I think that $\delta^\pi(x) = \mathbb{E}^\pi[\theta \mid x]$, but I'm not sure this is correct. Any hints on how to prove it? I'm a bit at a loss when working with conditional distributions.

Accepted answer:

$$\int_{\Theta} \log \frac{f(x \mid \theta)}{f(x \mid d)} \pi(\theta \mid x) \, d\theta = \mathbb{E}^{\pi} [\log f(x \mid \theta)] - \log f(x \mid d),$$ and the first term does not depend on $d$, so you should choose $d$ to maximize $\log f(x \mid d)$, i.e. the maximum likelihood estimate.
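A quick numerical sanity check of this conclusion, under an assumed conjugate normal setup (my own example, not from the post): minimizing the posterior expected loss above over a grid of candidate values $d$ recovers the MLE $d = x$, not the posterior mean.

```python
import numpy as np

# Assumed setup (illustration only): x | theta ~ N(theta, 1),
# prior theta ~ N(0, 1), observed x = 1.5.
# The posterior is then N(x/2, 1/2): posterior mean 0.75, MLE x = 1.5.
x = 1.5

def log_f(x, theta):
    # log density of N(theta, 1) at x, dropping the additive constant
    return -0.5 * (x - theta) ** 2

# Discretize the posterior N(x/2, 1/2) on a grid of theta values
thetas = np.linspace(-4.0, 6.0, 4001)
post = np.exp(-(thetas - x / 2) ** 2 / (2 * 0.5))
post /= post.sum()  # normalized weights of the discretized posterior

# Posterior expected loss for each candidate d:
#   E^pi[log f(x|theta)] - log f(x|d)
ds = np.linspace(-1.0, 4.0, 5001)
const = np.sum(post * log_f(x, thetas))  # term independent of d
risk = const - log_f(x, ds)              # vectorized over candidates d

d_star = ds[np.argmin(risk)]
print(d_star)  # ~1.5: the MLE (the observed x), not the posterior mean 0.75
```

The first term of the risk is constant in $d$, so the grid minimizer lands wherever $\log f(x \mid d)$ is largest, in agreement with the answer.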