Question about inverse-variance weighting


Suppose we want to make inference on an unobserved realization $x$ of a random variable $\tilde x$, which is normally distributed with mean $\mu_x$ and variance $\sigma^2_x$. Suppose there is another random variable $\tilde y$ (whose unobserved realization we'll similarly call $y$) that is normally distributed with mean $\mu_y$ and variance $\sigma^2_y$. Let $\sigma_{xy}$ be the covariance of $\tilde x$ and $\tilde y$.

Now suppose we observe a signal on $x$, \begin{align}a=x+\tilde u,\end{align} where $\tilde u\sim\mathcal{N}(0,\phi_x^2)$, and a signal on $y$, \begin{align}b=y+\tilde v,\end{align} where $\tilde v\sim\mathcal{N}(0,\phi_y^2)$. Assume that $\tilde u$ and $\tilde v$ are independent.

What is the distribution of $x$ conditional on $a$ and $b$?

What I know so far: Using inverse-variance weighting, \begin{align}\mathbb{E}(x\,|\,a)=\frac{\frac{1}{\sigma_x^2}\mu_x+\frac{1}{\phi_x^2}a}{\frac{1}{\sigma_x^2}+\frac{1}{\phi_x^2}},\end{align} and \begin{align} \mathbb{V}\text{ar}(x\,|\,a)=\frac{1}{\frac{1}{\sigma_x^2}+\frac{1}{\phi_x^2}}. \end{align}
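(For reference, these two expressions are the standard normal–normal Bayesian update: writing $f(x\mid a)\propto f(x)\,f(a\mid x)$ and completing the square in $x$ gives
\begin{align}
f(x \mid a) &\propto \exp\!\left(-\frac{(x-\mu_x)^2}{2\sigma_x^2}\right)\exp\!\left(-\frac{(a-x)^2}{2\phi_x^2}\right) \\
&\propto \exp\!\left(-\frac{1}{2}\left(\frac{1}{\sigma_x^2}+\frac{1}{\phi_x^2}\right)\left(x-\frac{\mu_x/\sigma_x^2+a/\phi_x^2}{1/\sigma_x^2+1/\phi_x^2}\right)^{\!2}\right),
\end{align}
which is a normal density with exactly the mean and variance above.)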

Since $x$ and $y$ are jointly drawn, $b$ should carry some information about $x$. Other than realizing this, I'm stuck. Any help is appreciated!

The conditional density is $f(x \mid a, b)=f(x,a,b)/f(a,b)$, where $f(x,a,b)=\int_{-\infty}^{\infty}f(x,y,a,b)\,\mathrm{d}y$ and $f(a,b)=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}f(x,y,a,b)\,\mathrm{d}x\,\mathrm{d}y$, with $f(x,y,a,b)$ the joint density of $(x,y,a,b)$. This joint density factors as $f(x,y,a,b)=f(x,y)\,f(a,b\mid x, y)$, and $f(a,b\mid x, y)=f(a\mid x)\,f(b \mid y)$ by the independence of $\tilde u$ and $\tilde v$. Here $f(x,y)$ is the density of a bivariate normal distribution, $f(a\mid x)$ is a normal density with mean $x$ and standard deviation $\phi_x$, and $f(b \mid y)$ is a normal density with mean $y$ and standard deviation $\phi_y$. Thus $f(x, y, a, b)$ has a closed form. This might help you.
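A numerical sketch of this conditioning (the parameter values below are illustrative assumptions, not from the question): since $(x,a,b)$ is jointly Gaussian with $\mathrm{Cov}(x,a)=\sigma_x^2$, $\mathrm{Cov}(x,b)=\mathrm{Cov}(a,b)=\sigma_{xy}$, $\mathbb{V}\text{ar}(a)=\sigma_x^2+\phi_x^2$, and $\mathbb{V}\text{ar}(b)=\sigma_y^2+\phi_y^2$, the standard Gaussian conditioning formula gives the posterior mean and variance of $x\mid a,b$ directly, without evaluating the integrals:

```python
import numpy as np

# Illustrative parameter values (assumptions, not given in the question).
mu_x, mu_y = 1.0, -0.5     # prior means
s2_x, s2_y = 2.0, 1.5      # prior variances sigma_x^2, sigma_y^2
s_xy = 0.8                 # covariance sigma_xy
p2_x, p2_y = 0.5, 0.7      # noise variances phi_x^2, phi_y^2

# Joint Gaussian blocks for (x, (a, b)):
#   a = x + u and b = y + v with independent noise, so
#   Cov(x, a) = sigma_x^2, Cov(x, b) = Cov(a, b) = sigma_xy.
Sigma12 = np.array([s2_x, s_xy])              # Cov(x, (a, b))
Sigma22 = np.array([[s2_x + p2_x, s_xy],
                    [s_xy, s2_y + p2_y]])     # Var((a, b))

def posterior(a, b):
    """Mean and variance of x | a, b via Gaussian conditioning."""
    w = Sigma12 @ np.linalg.inv(Sigma22)
    mean = mu_x + w @ np.array([a - mu_x, b - mu_y])
    var = s2_x - w @ Sigma12
    return mean, var

m, v = posterior(1.2, -0.1)
```

The posterior variance $\sigma_x^2-\Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}$ does not depend on the observed $(a,b)$, as usual for Gaussian models, and setting $\sigma_{xy}=0$ recovers the inverse-variance formulas from the question.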