$\text{V}(A|Y) = \text{V}(Y|A) \text{V}(Y)^{-1}\text{V}(A)$ using Bayes rule


Given Bayes' rule for density functions, \begin{align} f(a|y) = \frac{f(y|a)}{f(y)}f(a) \end{align} where the joint distribution is Gaussian, I aim to conclude \begin{align} \text{V}(A|Y) = \text{V}(Y|A) \text{V}(Y)^{-1}\text{V}(A) \end{align} Is this straightforward? How can we show it in the vector case, i.e., when each random variable is a random vector?
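A quick numerical sketch (using numpy, with a randomly generated joint covariance) suggests what the update below confirms: the identity cannot hold as a matrix equation in general — when $\dim A \neq \dim Y$ the product $\text{V}(Y|A)\text{V}(Y)^{-1}\text{V}(A)$ does not even typecheck — but it does hold after taking determinants. The variable names here (`VA`, `VY`, `CAY`) are mine, not from any source.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random joint covariance for (A, Y) with dim(A) = 2, dim(Y) = 3.
n, m = 2, 3
M = rng.standard_normal((n + m, n + m))
S = M @ M.T + (n + m) * np.eye(n + m)  # symmetric positive definite

VA = S[:n, :n]    # V(A)
VY = S[n:, n:]    # V(Y)
CAY = S[:n, n:]   # Cov(A, Y), shape (n, m)

# Gaussian conditional covariances (Schur complements of the joint covariance).
VA_given_Y = VA - CAY @ np.linalg.solve(VY, CAY.T)
VY_given_A = VY - CAY.T @ np.linalg.solve(VA, CAY)

# Determinant version: det V(A|Y) = det V(Y|A) * det V(Y)^{-1} * det V(A).
lhs = np.linalg.det(VA_given_Y)
rhs = np.linalg.det(VY_given_A) * np.linalg.det(VA) / np.linalg.det(VY)
print(np.isclose(lhs, rhs))  # True for this random instance
```

The determinant identity also follows directly from factoring the joint covariance determinant two ways via Schur complements: $\det \Sigma = \det V(Y)\det V(A|Y) = \det V(A)\det V(Y|A)$.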

The motivation is the error estimate before and after observing a single measurement. In particular, $A$ is the source and $Y$ is the new measurement.

Update: The formula above is correct ONLY after applying a determinant to both sides. Write $Y = HA + Z$ with $Z\sim N(0,V(Y|A))$ independent of $A$, so that $\text{Cov}(A,Y) = V(A)H^T$ and $V(Y) = HV(A)H^T + V(Y|A)$. Then \begin{align} V(A|Y)&= V(A) - \text{Cov}(A,Y)V(Y)^{-1}\text{Cov}(Y,A)\\ &= (I - V(A)H^TV(Y)^{-1}H)V(A)\\ &\stackrel{(a)}= (I + V(A)H^TV(Y|A)^{-1}H)^{-1}V(A) \end{align} where $(a)$ uses the Matrix Inversion Lemma. Taking determinants on both sides and applying Sylvester's determinant identity, we get \begin{align} \det V(A|Y)&= \det (I + V(A)H^TV(Y|A)^{-1}H)^{-1}\det V(A)\\ &= \det V(Y|A)\det(V(Y|A) + H V(A)H^T)^{-1}\det V(A)\\ &= \det V(Y|A)\det V(Y)^{-1}\det V(A) \end{align}
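The chain of equalities in the update can be sanity-checked numerically. The sketch below (my own variable names; `R` stands for $V(Y|A)$, `H` for the observation matrix in $Y = HA + Z$) verifies that the Schur-complement form of $V(A|Y)$ matches the Matrix-Inversion-Lemma form in step $(a)$, and that the determinant chain closes.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 2, 3

# Model: Y = H A + Z, A ~ N(0, VA), Z ~ N(0, R) independent; R = V(Y|A).
H = rng.standard_normal((m, n))
MA = rng.standard_normal((n, n)); VA = MA @ MA.T + n * np.eye(n)
MR = rng.standard_normal((m, m)); R = MR @ MR.T + m * np.eye(m)

VY = H @ VA @ H.T + R  # V(Y)

# V(A|Y) via the Schur complement: V(A) - Cov(A,Y) V(Y)^{-1} Cov(Y,A).
schur = VA - VA @ H.T @ np.linalg.solve(VY, H @ VA)

# Step (a): (I + V(A) H^T V(Y|A)^{-1} H)^{-1} V(A) via the Matrix Inversion Lemma.
woodbury = np.linalg.solve(np.eye(n) + VA @ H.T @ np.linalg.solve(R, H), VA)
print(np.allclose(schur, woodbury))  # True

# Determinant chain: det V(A|Y) = det V(Y|A) det(V(Y|A) + H V(A) H^T)^{-1} det V(A).
d1 = np.linalg.det(schur)
d2 = np.linalg.det(R) / np.linalg.det(R + H @ VA @ H.T) * np.linalg.det(VA)
print(np.isclose(d1, d2))  # True
```

Note that `R + H @ VA @ H.T` is exactly $V(Y)$, which is what collapses the last line of the determinant chain to $\det V(Y|A)\det V(Y)^{-1}\det V(A)$.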