Signals extraction from different multivariate normal - with a prior and 2 biased signals (each with different precision)


I have a signal extraction problem. I've been reading other posts (and several documents) that consider similar but simpler problems. Putting everything together, I think I was able to derive the solution, but I'm unsure and would appreciate it if someone could confirm that what I did is correct.

Suppose that I'm interested in $\theta\sim N(\mu_{\theta},\sigma^2_{\theta})$, but I only observe two imperfect (biased) signals \begin{align} s &= \theta+\epsilon\\ g &= \theta+y \end{align} where $\epsilon\sim N(\mu_{\epsilon},\sigma^2_{\epsilon})$ and $y\sim N(\mu_{y},\sigma^2_{y})$ (with $\theta$, $\epsilon$ and $y$ being independent).

What I need to compute is basically $ E(\theta\mid s,g)$. I came to the conclusion that I can obtain a sufficient statistic by averaging the signals $s,g$ with the weights $\frac{\sigma_{y}^{2}}{\sigma_{y}^{2}+\sigma_{\epsilon}^{2}}$ and $\frac{\sigma_{\epsilon}^{2}}{\sigma_{y}^{2}+\sigma_{\epsilon}^{2}}$, i.e., the inverse-variance (precision) weights of the two noise terms, normalized to sum to one.

Then, denoting by $Z=\frac{\sigma _{y}^{2}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}} s+\frac{\sigma _{\epsilon }^{2}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}} g=\theta+\frac{\sigma _{y}^{2}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}} \epsilon+\frac{\sigma _{\epsilon }^{2}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}} y$, we have that \begin{equation*} \begin{cases} Z\mid \theta \sim N\left(\theta+\frac{\sigma_{y}^{2} \mu_{\epsilon}+\sigma _{\epsilon }^{2} \mu_{y}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}},\ \frac{\sigma _{y}^{2}\sigma _{\epsilon }^{2}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}}\right) \\ Z \sim N\left(\mu_{\theta}+\frac{\sigma_{y}^{2} \mu_{\epsilon}+\sigma _{\epsilon }^{2} \mu_{y}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}},\ \sigma^2_\theta+\frac{\sigma _{y}^{2}\sigma _{\epsilon }^{2}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}}\right)\\ \operatorname{cov}(\theta,Z)=\sigma^2_{\theta}\\ E(\theta\mid Z)= \mu_{\theta}+\frac{\operatorname{cov}(\theta,Z)}{\operatorname{var}(Z)}\left(Z-E(Z)\right) \end{cases} \end{equation*} (Note that the mean of $Z\mid\theta$ must include $\theta$ itself, since conditioning on $\theta$ only removes the randomness of $\epsilon$ and $y$; the second argument of $N(\cdot,\cdot)$ is the variance throughout.)
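The claimed moments of $Z$ can be sanity-checked by simulation. Below is a minimal Monte Carlo sketch (assuming `numpy`; the parameter values are arbitrary, chosen only for illustration) that draws $(\theta,\epsilon,y)$, forms $Z$ with the precision weights above, and compares the sample mean, variance, and $\operatorname{cov}(\theta,Z)$ against the formulas:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, for illustration only
mu_th, s2_th = 1.0, 2.0      # prior mean and variance of theta
mu_e,  s2_e  = 0.5, 1.5      # bias and variance of epsilon
mu_y,  s2_y  = -0.3, 0.8     # bias and variance of y

n = 1_000_000
th = rng.normal(mu_th, np.sqrt(s2_th), n)
e  = rng.normal(mu_e,  np.sqrt(s2_e),  n)
y  = rng.normal(mu_y,  np.sqrt(s2_y),  n)

# Precision weights on the two signals s = th + e and g = th + y
w_s = s2_y / (s2_y + s2_e)
w_g = s2_e / (s2_y + s2_e)
Z = w_s * (th + e) + w_g * (th + y)

# Claimed unconditional moments of Z and cov(theta, Z)
EZ   = mu_th + (s2_y * mu_e + s2_e * mu_y) / (s2_y + s2_e)
VarZ = s2_th + s2_y * s2_e / (s2_y + s2_e)

print(Z.mean(), EZ)                  # should agree to ~2 decimals
print(Z.var(),  VarZ)
print(np.cov(th, Z)[0, 1], s2_th)    # cov(theta, Z) = sigma^2_theta
```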

Substituting into $E(\theta\mid Z)$, I obtain \begin{equation} E(\theta\mid s,g)= \mu_{\theta}+\frac{\sigma^2_{\theta}}{\sigma^2_\theta+\frac{\sigma_{y}^{2}\sigma_{\epsilon }^{2}}{\sigma_{y}^{2}+\sigma_{\epsilon}^{2}}}\left(Z-\left(\mu_{\theta}+\frac{\sigma_{y}^{2} \mu_{\epsilon}+\sigma _{\epsilon }^{2} \mu_{y}}{\sigma _{y}^{2}+\sigma _{\epsilon }^{2}}\right)\right) \end{equation} which, after some algebra, becomes \begin{equation} E(\theta\mid s,g)=E(\theta\mid Z)= \frac{\sigma_{y}^{2}\sigma_{\epsilon }^{2}\mu_{\theta}+\sigma^2_{\theta}\sigma_{y}^{2}\left(s-\mu_{\epsilon}\right) +\sigma^2_{\theta}\sigma_{\epsilon }^{2}\left( g-\mu_{y}\right) } { \sigma^2_\theta \left( \sigma_{y}^{2}+\sigma_{\epsilon}^{2}\right)+\sigma_{y}^{2}\sigma_{\epsilon }^{2}} \end{equation}
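One way to verify this final expression independently is via the standard conditional-mean formula for a jointly normal vector, since $(\theta, s, g)$ is multivariate normal with $\operatorname{var}(s)=\sigma^2_\theta+\sigma^2_\epsilon$, $\operatorname{var}(g)=\sigma^2_\theta+\sigma^2_y$, and all pairwise covariances equal to $\sigma^2_\theta$. A minimal sketch (assuming `numpy`; the parameter values and the observed pair $(s,g)$ are arbitrary, for illustration only):

```python
import numpy as np

# Hypothetical parameter values and observations, for illustration only
mu_th, s2_th = 1.0, 2.0
mu_e,  s2_e  = 0.5, 1.5
mu_y,  s2_y  = -0.3, 0.8
s, g = 2.4, 0.9              # an arbitrary observed pair (s, g)

# Closed-form expression derived above
num = (s2_y * s2_e * mu_th
       + s2_th * s2_y * (s - mu_e)
       + s2_th * s2_e * (g - mu_y))
den = s2_th * (s2_y + s2_e) + s2_y * s2_e
post_mean_formula = num / den

# Independent check: conditional mean of a jointly normal vector,
#   E(theta | s, g) = mu_th + Sigma_12 @ inv(Sigma_22) @ (obs - E[obs])
Sigma_12 = np.array([s2_th, s2_th])             # cov(theta, (s, g))
Sigma_22 = np.array([[s2_th + s2_e, s2_th],
                     [s2_th, s2_th + s2_y]])    # var of (s, g)
mean_sg = np.array([mu_th + mu_e, mu_th + mu_y])
post_mean_mvn = mu_th + Sigma_12 @ np.linalg.solve(
    Sigma_22, np.array([s, g]) - mean_sg)

print(post_mean_formula, post_mean_mvn)   # should match to machine precision
```

For these values both routes give the same number, which also equals the familiar precision-weighted average $\bigl(\mu_\theta/\sigma^2_\theta+(s-\mu_\epsilon)/\sigma^2_\epsilon+(g-\mu_y)/\sigma^2_y\bigr)/\bigl(1/\sigma^2_\theta+1/\sigma^2_\epsilon+1/\sigma^2_y\bigr)$.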

I'd really appreciate it if you could double-check and confirm that what I did is correct.

Thanks!

PS: a couple of posts deal with similar, though not identical, issues. In particular, I think my results are fully consistent with https://stats.stackexchange.com/questions/179213/mean-of-two-normal-distributions and with "A priori normal and likelihood normal with different variances".