If $X_i\sim N(0,\frac{1}{\theta})$, find $E\left(\frac{1}{\sum_{i=1}^n X_i^2 +2}\right)$


The initial question states that $X_i \sim \mathcal{N}(0,\frac{1}{\theta})$, where $\theta$ follows an exponential distribution with parameter equal to $1$. We are asked to derive the Bayesian estimator $\hat{\theta}_n$ of $\theta$ and show that it is a consistent estimator. I followed the usual approach and first derived the posterior $g(\theta\mid x)$, which is a Gamma$(\alpha,\beta)$ with $\alpha=\frac{n+2}{2}$ and $\beta=\frac{2}{\sum_{i=1}^n X_i^2 +2}$.
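For reference, a short sketch of how that posterior arises (writing the Exp$(1)$ prior density as $e^{-\theta}$ for $\theta>0$):

```latex
\begin{aligned}
g(\theta \mid x)
  &\propto \underbrace{\prod_{i=1}^{n} \sqrt{\frac{\theta}{2\pi}}
        \exp\!\left(-\frac{\theta x_i^{2}}{2}\right)}_{\text{likelihood}}
     \cdot \underbrace{e^{-\theta}}_{\text{Exp}(1)\text{ prior}} \\[4pt]
  &\propto \theta^{n/2}
     \exp\!\left(-\theta \,\frac{\sum_{i=1}^{n} x_i^{2} + 2}{2}\right),
\end{aligned}
```

which is the kernel of a Gamma distribution with shape $\alpha=\frac{n+2}{2}$ (since $\theta^{n/2}=\theta^{\alpha-1}$) and scale $\beta=\frac{2}{\sum_{i=1}^n x_i^2+2}$.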

Then I derived the mean of this posterior, which is $\alpha\beta$.

I intended to show that $$\lim_{n\to \infty}P(|\hat{\theta}_n-\theta|\leq \epsilon)=1$$

by showing that $$\lim_{n\to\infty}E\big[(\hat{\theta}_n-\theta)^2\big]=0,$$ since convergence in mean square implies convergence in probability.

My first instinct is to simply compute $E\left(\sum_{i=1}^n X_i^2\right)$ in the denominator, but I am concerned that this would be inappropriate, since the expectation of a ratio is not the ratio of expectations.
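To see concretely why replacing the denominator by its expectation is not justified, here is a quick Monte Carlo sketch (a hypothetical toy example, not part of the original problem): for a sum of squared standard normals $S$, Jensen's inequality gives $E\!\left[\frac{1}{S+2}\right] > \frac{1}{E[S]+2}$ because $t \mapsto \frac{1}{t+2}$ is convex.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy example: S = sum of 5 squared standard normals (chi-squared, 5 df)
s = (rng.normal(size=(100_000, 5)) ** 2).sum(axis=1)

lhs = np.mean(1.0 / (s + 2))   # Monte Carlo estimate of E[1/(S+2)]
rhs = 1.0 / (np.mean(s) + 2)   # 1/(E[S]+2); E[S] = 5, so this is close to 1/7

print(lhs, rhs)  # lhs is strictly larger than rhs, by Jensen's inequality
```

The gap between the two quantities does not vanish, which is why taking the expectation only inside the denominator would give the wrong answer.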

1 Answer

I do not understand why you want to compute the expectation $E\left(\sum_{i=1}^n X_i^2\right)$.

If the posterior is as you say, with \begin{equation} \alpha_n =\frac{n+2}{2};\quad\beta_n=\frac{2}{\sum_{i=1}^n X_i^2 +2}, \end{equation} and you want to use the mean of the posterior as your Bayesian estimator, then (as you already wrote) you need to find $\alpha_n \beta_n$.

It is easy to see that \begin{equation} \alpha_n\beta_n =\frac{n+2}{\sum_{i=1}^n X_i^2 +2} = \frac{1+\frac{2}{n}}{\left(\frac{1}{n} \sum_{i=1}^n X_i^2\right) + \frac{2}{n}} \end{equation}

Using the law of large numbers together with the distributional and independence assumptions on the $X_i$ (so that $E[X_i^2 \mid \theta] = \frac{1}{\theta}$), it holds that $$ \frac{1}{n} \sum_{i=1}^n X_i^2 \to \frac{1}{\theta} \quad \text{ as } n \to \infty$$

Therefore $$ \alpha_n \beta_n \to \theta \quad \text{ as } n \to \infty$$ in probability (or almost surely, depending on whether you use the weak or the strong law of large numbers).

Therefore, the estimator is consistent.
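As a numerical sanity check (not part of the original argument), here is a short Monte Carlo sketch of the convergence $\alpha_n\beta_n \to \theta$, using a hypothetical true value $\theta = 2.5$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.5                       # hypothetical true precision, for illustration
sigma = 1.0 / np.sqrt(theta)      # X_i ~ N(0, 1/theta), so sd = 1/sqrt(theta)

for n in (10, 1_000, 1_000_000):
    x = rng.normal(0.0, sigma, size=n)
    theta_hat = (n + 2) / (np.sum(x ** 2) + 2)   # posterior mean alpha_n * beta_n
    print(n, theta_hat)                          # approaches theta = 2.5 as n grows
```

With large $n$ the estimate settles near the true $\theta$, consistent with the law-of-large-numbers argument above.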