Finding the Likelihood Function and Posterior Distribution


Let $\underline{X}=(X_1,...X_n)$ be an i.i.d. random sample from an exponential distribution, with probability density function given by

$f(x; \lambda)=\lambda \exp\{-\lambda x\}, \quad x>0,$ where $\lambda$ is an unknown parameter taking values in $\mathbb{R}^+$.

A. Derive the likelihood function $L(\lambda; \underline{X})$ and derive the Fisher information $I(\lambda)$ measuring the amount of information that $\underline{X}$ carries about $\lambda$.

B. Given that the Jeffreys prior for $\lambda$ is $\pi_J(\lambda)\propto \lambda^{-1}$, derive the posterior distribution for $\lambda$. Find the mean and variance of $\lambda \mid \underline{X}$.

I know that for A I need to take the product over the $n$ observations, and that the Fisher information is $I(\lambda)=-\mathbb{E}[\ell''(\lambda)]$, but I'm not sure how to do this for my given model.


On BEST ANSWER

A: The likelihood is

$$L(\lambda)=\lambda^n e^{-\lambda\sum_i X_i}$$

To calculate the Fisher information, as you stated, you have to calculate

$$-n\mathbb{E}\Bigg[\frac{\partial^2}{\partial\lambda^2}\log f(x;\lambda)\Bigg]$$

where $f$ is the density of a single observation. Thus, simply,

$$\log f=\log\lambda-\lambda x$$

$$\frac{\partial^2}{\partial\lambda^2}\log f=-\frac{1}{\lambda^2}$$

So the Fisher information of the $n$-tuple $(X_1,\dots,X_n)$ is

$$\frac{n}{\lambda^2}$$
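As a sanity check (not part of the original answer), the Fisher information also equals the variance of the score $U(\lambda)=\partial_\lambda \log L(\lambda)=n/\lambda-\sum_i X_i$, so it can be verified by simulation. A minimal sketch, assuming NumPy and illustrative values of $\lambda$ and $n$:

```python
import numpy as np

# Monte Carlo check that I(lambda) = Var[score] = n / lambda^2
# for an i.i.d. exponential sample. The values of lam, n, reps are
# arbitrary illustrative choices.
rng = np.random.default_rng(0)
lam, n, reps = 2.0, 50, 200_000

# Simulate `reps` samples of size n; numpy parametrises the
# exponential by its scale 1/lambda, not its rate.
sums = rng.exponential(scale=1 / lam, size=(reps, n)).sum(axis=1)

# Score evaluated at the true lambda: n/lambda - sum_i X_i.
scores = n / lam - sums

empirical_info = scores.var()
theoretical_info = n / lam**2  # the derived n / lambda^2

print(empirical_info, theoretical_info)
```

The two printed numbers should agree up to Monte Carlo error, since $\operatorname{Var}(\sum_i X_i)=n/\lambda^2$ for exponential data.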


B

The posterior is the following

$$\pi(\lambda|\mathbf{x})\propto \pi(\lambda)\cdot p(\mathbf{x}|\lambda)$$

that is

$$\pi(\lambda|\mathbf{x})\propto \lambda^{-1}\times \lambda^n e^{-\lambda\sum_i X_i}= \lambda^{n-1}e^{-\lambda\sum_i X_i}$$

We immediately recognize the kernel of a $\mathrm{Gamma}(n,\sum_i X_i)$ distribution (shape $n$, rate $\sum_i X_i$), thus

$$\mathbb{E}[\lambda|\mathbf{x}]=\frac{n}{\sum_i X_i}$$

and

$$\mathbb{V}[\lambda|\mathbf{x}]=\frac{n}{\left(\sum_i X_i\right)^2}$$
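These moments can also be checked numerically by sampling from the identified posterior. A minimal sketch, assuming NumPy and illustrative values for the true $\lambda$ and $n$ (note that NumPy's gamma sampler takes a scale, i.e. the reciprocal of the rate):

```python
import numpy as np

# Monte Carlo check: under the Jeffreys prior, the answer above gives
# lambda | x ~ Gamma(shape = n, rate = sum_i x_i). The values of
# lam_true and n are arbitrary illustrative choices.
rng = np.random.default_rng(1)
lam_true, n = 1.5, 40
x = rng.exponential(scale=1 / lam_true, size=n)
s = x.sum()

# Closed-form posterior moments from the Gamma(n, rate = s) kernel.
post_mean = n / s        # E[lambda | x]
post_var = n / s**2      # Var[lambda | x]

# Draw from the posterior; numpy uses shape/scale, so scale = 1 / rate.
draws = rng.gamma(shape=n, scale=1 / s, size=500_000)

print(post_mean, draws.mean())
print(post_var, draws.var())
```

The empirical mean and variance of the draws should match the closed-form $n/\sum_i x_i$ and $n/(\sum_i x_i)^2$ up to Monte Carlo error.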