Hi, I am reading Bayesian Data Analysis by Andrew Gelman on my own, and I am trying to solve some of the exercises in the text. The book asks the reader to prove the following property of the variance estimator built from $m$ parallel Markov chains, each of length $n$:
Denote $$\bar{X}_j=\frac{1}{n}\sum_{i=1}^n X_{ij},\qquad \bar{X}=\frac{1}{m}\sum_{j=1}^m\bar{X}_j$$
$$B=\frac{n}{m-1}\sum_{j=1}^m(\bar{X}_{j}-\bar{X})^2$$
$$W=\frac{1}{m}\sum_{j=1}^m s_j^2, \text{ where } s_j^2=\frac{1}{n-1}\sum_{i=1}^n(X_{ij}-\bar{X}_j)^2$$
$$\hat{\text{var}}(X)=\frac{n-1}{n}W+\frac{1}{n}B$$
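To make sure I am reading the definitions correctly, I wrote a quick numerical sanity check. Note this is only a sketch: it simplifies by drawing each chain i.i.d. from the target $N(0,\sigma^2)$ (a real MCMC chain would be autocorrelated, which is what the exercise is really about), and the names `B`, `W`, `vhat` are just my own labels for the quantities above.

```python
import numpy as np

# Simplifying assumption: each "chain" is i.i.d. draws from the target
# N(0, sigma2); real MCMC output would be autocorrelated.
rng = np.random.default_rng(0)
m, n, sigma2 = 4, 50, 2.0   # m chains, each of length n
reps = 20000                # Monte Carlo replications to estimate E[vhat]

# X[r, i, j] = i-th draw of chain j in replication r
X = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n, m))

chain_means = X.mean(axis=1)                  # \bar X_j, shape (reps, m)
B = n * chain_means.var(axis=1, ddof=1)       # between-chain variance
W = X.var(axis=1, ddof=1).mean(axis=1)        # average within-chain variance
vhat = (n - 1) / n * W + B / n                # the estimator from the text

print(vhat.mean())  # close to sigma2 = 2.0 in this i.i.d. setting
```

In this idealized i.i.d. setting the average of `vhat` comes out very close to $\sigma^2$, which is consistent with the unbiasedness claim.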
Prove that $\hat{\text{var}}(X)$ is an unbiased estimate of the target variance if the starting distribution is the same as the target distribution and the $m$ parallel sequences are independent. (Hint: show that $\hat{\text{var}}(X)$ can be expressed as the average of the halved squared differences between simulations from different sequences, and that each of these has expectation equal to the variance.)
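As I understand it, the hint rests on the standard identity for two independent draws $X, Y$ with common mean $\mu$ and common variance $\sigma^2$:
$$E\left[\tfrac{1}{2}(X-Y)^2\right]=\tfrac{1}{2}\left(\operatorname{Var}(X)+\operatorname{Var}(Y)+(EX-EY)^2\right)=\sigma^2,$$
which applies here because draws from different sequences are independent and, by the assumption on the starting distribution, each is marginally distributed according to the target.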
My attempt (treating every draw as marginally distributed with variance $\sigma^2$):
$$EW=\frac{1}{m}\sum_{j=1}^m E[s_j^2]=\frac{1}{m}\sum_{j=1}^m\sigma^2=\sigma^2$$
$$EB=\frac{n}{m-1}\sum_{j=1}^m E[(\bar{X}_{j}-\bar{X})^2]= \frac{n}{m-1}\sum_{j=1}^m \frac{m-1}{m}\operatorname{Var}(\bar{X}_j)$$ where I took $\operatorname{Var}(\bar{X}_j)=\sigma^2$, giving $EB=n\sigma^2$.
$$E[\hat{\text{var}}(X)] = \frac{n-1}{n}EW+\frac{1}{n}EB = \frac{n-1}{n}\sigma^2+\frac{1}{n}n\sigma^2 = \frac{2n-1}{n}\sigma^2 \neq \sigma^2$$
Since this does not come out to $\sigma^2$, I must be making a mistake somewhere, perhaps in how I handled $\operatorname{Var}(\bar{X}_j)$, or because I am not actually using the hint. Where does my argument go wrong? Thanks in advance.