How to find variance of a vector?


I am given a set of measurements:

$\tilde{y_1}=x+v_1$

$\tilde{y_2}=x+v_2$

where $v_1, v_2$ are random variables with $E\{v_1\}=E\{v_2\}=E\{v_1v_2\}=0$, $E\{v_1^2\}=a$, $E\{v_2^2\}=b$. A least-squares solution with basis matrix $H=[1\:1]^T$ and weighting matrix $W=0.5I$ is:

$\tilde{x}=(H^TWH)^{-1}H^TW\tilde{y}=0.5\tilde{y_1}+0.5\tilde{y_2}$
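As a quick numerical sanity check (a sketch using NumPy; the variable names are mine, not from the problem), the weighted least-squares gain $(H^TWH)^{-1}H^TW$ with $H=[1\:1]^T$ and $W=0.5I$ does indeed come out to equal weights of $0.5$ on each measurement:

```python
import numpy as np

# Basis matrix H = [1 1]^T and weighting matrix W = 0.5*I, as given
H = np.array([[1.0], [1.0]])
W = 0.5 * np.eye(2)

# Weighted least-squares gain: (H^T W H)^{-1} H^T W
K = np.linalg.inv(H.T @ W @ H) @ H.T @ W
print(K)  # [[0.5 0.5]], i.e. x_tilde = 0.5*y1 + 0.5*y2
```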

I need to find the variance of the error, i.e. $E\{(x-\tilde{x})^2\}$, but $x$ is a $2\times 1$ vector while $\tilde{x}$ is a scalar, so I am not sure how to proceed. Any ideas?

On BEST ANSWER

From my understanding, you have interpreted the problem incorrectly. $x$ is itself a random variable, not a random vector, and $\tilde{y}=[\tilde{y}_1\space \space \space \tilde{y}_2]^T$, in which each $\tilde{y}_i$ is itself a random variable, since $x$ is a random variable (not a random vector). If you proceed in this way, then you have:

$E[(x-\tilde{x})^2]=E[(x-0.5(2x+v_1+v_2))^2]=E[(-0.5(v_1+v_2))^2]$

$=0.25\left(E[v_1^2]+E[v_2^2]+2E[v_1v_2]\right)=\frac{a+b}{4}$
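You can confirm the $\frac{a+b}{4}$ result with a quick Monte Carlo simulation (a sketch; the values of $a$, $b$, and $x$ below are arbitrary, and Gaussian noise is just one choice consistent with the stated moments):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 3.0          # E[v1^2] = a, E[v2^2] = b (illustrative values)
n = 1_000_000

x = 1.7                  # any fixed x; the error variance does not depend on it
v1 = rng.normal(0.0, np.sqrt(a), n)  # zero mean, variance a
v2 = rng.normal(0.0, np.sqrt(b), n)  # zero mean, variance b, independent of v1

x_tilde = 0.5 * (x + v1) + 0.5 * (x + v2)  # x_tilde = 0.5*y1 + 0.5*y2
err_var = np.mean((x - x_tilde) ** 2)

print(err_var, (a + b) / 4)  # the two should agree closely
```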