Optimal combination of two estimates


I have a set of random variables, $X_1,\dots,X_N$. They are i.i.d. Gaussian with zero mean and $w$ variance. I observe $Y_1,\dots,Y_N$ where $Y_i=\sum_{j=1}^N a_{ij} X_j+N_i$ where all $a_{ij}$s are known and $N_i$ is Gaussian noise with zero mean and $\sigma^2$ variance.

Now I make a linear minimum mean squared error (LMMSE) estimate $\tilde{X}_{i^\prime1}$ of $X_{i^\prime}$ given $Y_1,\dots,Y_{i^\prime}$. Then I make a similar estimate $\tilde{X}_{{i^\prime}2}$ of $X_{i^\prime}$ given $Y_{i^\prime},\dots,Y_N$.

(a) Are these two estimates $\tilde{X}_{{i^\prime}1}$, $\tilde{X}_{{i^\prime}2}$ independent? My guess is that they are not.

(b) What is the optimal combination of these two estimates $\tilde{X}_{{i^\prime}1}$ and $\tilde{X}_{{i^\prime}2}$?

Please explain these questions for me. Thanks a lot.

PS. The question Optimally combining samples to estimate averages is similar in form but different, since there the estimates are independent.

BEST ANSWER

I'll outline the answer.

Really, this whole problem is just a variation on the computation of linear minimum mean square error estimators. Once you understand one, you understand them all!

Recall: Suppose $A$ is a random variable you want to estimate and $B_1,\ldots,B_n$ are random variables with $\mathbb{E}(A) = \mathbb{E}(B_i) = 0$ for all $i$. You want to form an estimator $\tilde{A} = \sum_i \alpha_i B_i$ of $A$ for some deterministic coefficients $\alpha\equiv(\alpha_1,\ldots,\alpha_n)$. Observe \begin{eqnarray} \mathbb{E}(\tilde{A} - A)^2 &=& \mathbb{E}\left(\sum_i \alpha_i B_i - A \right)^2 \\ &=& \mathrm{Var}(A) + \sum_{i,j} \alpha_i \alpha_j \mathrm{Cov}(B_i,B_j) - 2\sum_i \alpha_i \mathrm{Cov}(A,B_i). \end{eqnarray} Differentiating with respect to each $\alpha_i$ and setting the result to zero yields the $\alpha$ with minimum expected squared error: $$ \alpha = M^{-1} V, $$ where $M$ is the $n\times n$ matrix with $i,j$ entry $\mathrm{Cov}(B_i,B_j)$ and $V$ is the $n\times 1$ vector with $i$th entry $\mathrm{Cov}(A,B_i)$.
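The recipe above is short enough to sketch in a few lines of numpy. This is purely illustrative; the function name and the toy scalar example are mine, not part of the problem.

```python
import numpy as np

# Sketch of the general recipe: alpha = M^{-1} V, where
# M[i, j] = Cov(B_i, B_j) and V[i] = Cov(A, B_i).

def lmmse_weights(M, V):
    """Coefficients alpha minimizing E(sum_i alpha_i B_i - A)^2."""
    # Solve M alpha = V rather than forming M^{-1} explicitly.
    return np.linalg.solve(M, V)

# Toy check: estimate A from the single observation B_1 = A + noise.
# Then M = Var(A) + sigma^2 (a 1x1 matrix) and V = Var(A), so
# alpha = Var(A) / (Var(A) + sigma^2), the familiar scalar Wiener weight.
var_A, sigma2 = 2.0, 0.5
M = np.array([[var_A + sigma2]])
V = np.array([var_A])
alpha = lmmse_weights(M, V)
print(alpha)  # [0.8]
```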

Okay. That was really general. It always works to find the linear estimator with minimum expected squared error (LMMSE or whatever you want to call it). No assumptions about independence or anything else. The only assumptions: $A$ and the $B_i$ have mean $0$ (easily extended to the non-centered case), the matrix $M$ is non-singular (always true unless some $B_i$ is a linear combination of the others, i.e., redundant), and $M$ and $V$ must be known (rarely true in the real world, often true in problem sets).

So in this problem, to compute $\tilde{X}_{i'1}$ and $\tilde{X}_{i'2}$, use the above method with the following calculations: \begin{eqnarray} \mathrm{Cov}(Y_i,Y_j) &=& w \sum_k a_{ik} a_{jk} + \sigma^2 \delta_{ij} \\ \mathrm{Cov}(X_{i'},Y_i) &=& w\, a_{i i'} \end{eqnarray} Again, just carefully apply the above method. You're going to have to say something like, "The optimal $\alpha$ is $M^{-1} V$ where $M$ is the matrix ... and $V$ is the column vector ..." where the "..."s are given in terms of the $\mathrm{Cov}$ calculations above. Not very insightful!
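Concretely, each of the two window estimates comes from restricting those covariances to the observations actually used. A minimal sketch (function and variable names are mine; indices are 0-based, so $i' $ becomes `ip`):

```python
import numpy as np

def lmmse_from_window(a, w, sigma2, ip, rows):
    """LMMSE weights for estimating X_{ip} from the observations Y_i, i in rows."""
    N = a.shape[0]
    C = w * a @ a.T + sigma2 * np.eye(N)   # Cov(Y_i, Y_j) = w sum_k a_ik a_jk + sigma^2 delta_ij
    M = C[np.ix_(rows, rows)]              # restrict to the observations in the window
    V = w * a[rows, ip]                    # Cov(X_{ip}, Y_i) = w a_{i,ip}
    return np.linalg.solve(M, V)

rng = np.random.default_rng(0)
N, w, sigma2, ip = 5, 1.0, 0.25, 2
a = rng.standard_normal((N, N))
alpha1 = lmmse_from_window(a, w, sigma2, ip, list(range(ip + 1)))  # uses Y_1, ..., Y_{i'}
alpha2 = lmmse_from_window(a, w, sigma2, ip, list(range(ip, N)))   # uses Y_{i'}, ..., Y_N
```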

The only interesting thing here is that if one lets $C$ be the $N\times N$ covariance matrix with $i,j$ entry $\mathrm{Cov}(Y_i,Y_j)$, then $$ C = w\, a a^T + \sigma^2 I_N $$ where $a$ is the $N\times N$ matrix with $i,j$ entry $a_{ij}$. (Here $C$ gets its own letter to avoid a clash with the vector $V$ above.)
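If in doubt, that matrix identity is easy to sanity-check by Monte Carlo; the snippet below is just such a check under arbitrary made-up parameters.

```python
import numpy as np

# Simulate Y = a X + noise many times and compare the sample covariance
# of Y with the closed form w a a^T + sigma^2 I.
rng = np.random.default_rng(1)
N, w, sigma2, T = 4, 1.0, 0.5, 200_000
a = rng.standard_normal((N, N))

X = rng.normal(0.0, np.sqrt(w), size=(T, N))        # i.i.d. N(0, w) signals
noise = rng.normal(0.0, np.sqrt(sigma2), size=(T, N))
Y = X @ a.T + noise                                  # Y_i = sum_j a_ij X_j + N_i

C_hat = np.cov(Y, rowvar=False)
C_theory = w * a @ a.T + sigma2 * np.eye(N)
print(np.max(np.abs(C_hat - C_theory)))  # small, shrinking like 1/sqrt(T)
```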

To compute the optimal linear estimator given $\tilde{X}_{i'1}$ and $\tilde{X}_{i'2}$, you can use exactly the same method with $n=2$: take $A = X_{i'}$, $B_1 = \tilde{X}_{i'1}$, $B_2 = \tilde{X}_{i'2}$. However, the required calculations for $\mathrm{Cov}(\tilde{X}_{i'1},\tilde{X}_{i'2})$, $\mathrm{Var}(\tilde{X}_{i'k})$, and $\mathrm{Cov}(X_{i'},\tilde{X}_{i'k})$ will depend on the optimal coefficients computed in the first part, so it's possibly very messy. Note this also answers (a): both estimates are linear functions of observation sets that overlap at $Y_{i'}$, and every $Y_i$ involves all of the $X_j$, so their cross-covariance is generically nonzero and the two estimates are indeed dependent in general, as you guessed. In my opinion, I would not be surprised if no big simplifications are possible. I don't know. Try it out. Good luck.
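The messy bookkeeping is at least mechanical. A sketch of part (b) under the same assumed conventions as before (0-based `ip` for $i'$; all names mine): since $\tilde{X}_{i'k} = \alpha_k^T Y_{\text{rows}_k}$, every needed covariance is a quadratic form in the window coefficients, and the combination weights come from one more $2\times 2$ application of $\alpha = M^{-1}V$.

```python
import numpy as np

def combine(a, w, sigma2, ip):
    """Optimal linear combination of the two window LMMSE estimates of X_{ip}."""
    N = a.shape[0]
    C = w * a @ a.T + sigma2 * np.eye(N)            # Cov(Y)
    rows1, rows2 = np.arange(ip + 1), np.arange(ip, N)
    v1, v2 = w * a[rows1, ip], w * a[rows2, ip]     # Cov(X_{ip}, Y_i) per window
    a1 = np.linalg.solve(C[np.ix_(rows1, rows1)], v1)   # weights of X~_{i'1}
    a2 = np.linalg.solve(C[np.ix_(rows2, rows2)], v2)   # weights of X~_{i'2}

    # 2x2 covariance of (X~_{i'1}, X~_{i'2}) and cross-covariance with X_{ip}:
    # Cov(a1.Y1, a2.Y2) = a1^T C[rows1, rows2] a2, Cov(X_{ip}, ak.Yk) = ak^T vk.
    c12 = a1 @ C[np.ix_(rows1, rows2)] @ a2
    M = np.array([[a1 @ C[np.ix_(rows1, rows1)] @ a1, c12],
                  [c12, a2 @ C[np.ix_(rows2, rows2)] @ a2]])
    V = np.array([a1 @ v1, a2 @ v2])
    return np.linalg.solve(M, V)

rng = np.random.default_rng(2)
a = rng.standard_normal((6, 6))
beta = combine(a, w=1.0, sigma2=0.3, ip=2)
print(beta)  # weights on X~_{i'1} and X~_{i'2}
```

Note the off-diagonal entry `c12` is generically nonzero, which is the dependence from part (a) showing up numerically.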