The random vector $X = (X_1,X_2)^T$ has mean $\Bbb E[X] = \mu$ and covariance matrix $\operatorname{Cov}(X) = \Sigma$. It holds that
$$\Bbb E[B(X - b)] = (0,0)^T \quad \text{and} \quad \operatorname{Cov}(B(X - b)) = I_2,$$
where $b = (1,2)^T$ and $B = \begin{pmatrix} -1 & 2 \\ 1 & -1 \end{pmatrix}$.
Compute the mean vector and covariance matrix of $X$.
After plugging $B$ and $b$ into the expectation, I found that the mean vector equals $b$. I am not sure how to proceed with the covariance matrix.
Given: $E[B(X-b)] = (0,0)^T \implies BE[X] = Bb \implies E[X] = \mu_x = B^{-1}Bb = b = (1,2)^T$, using that $B$ is invertible ($\det B = -1 \neq 0$).
Now we need to find $\operatorname{Cov}(X) = E[XX^T] - \mu_x\mu_x^T$; the only unknown is $E[XX^T]$, since $\mu_x$ is now known.
Let $Y = X-b$. Then $E[Y] = \mu_y = E[X-b] = E[X] - b = (0,0)^T$.
By definition, $\operatorname{Cov}(Y) = E[YY^T] - \mu_y\mu_y^T$, and since $\mu_y = 0$ this reduces to $\operatorname{Cov}(Y) = E[YY^T]$.
Also, $\operatorname{Cov}(BY) = B\operatorname{Cov}(Y)B^T$ (you can verify this directly from the definition of covariance).
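If you want to see the rule $\operatorname{Cov}(BY) = B\operatorname{Cov}(Y)B^T$ in action, here is a quick Monte Carlo sketch; the covariance matrix `S` below is an arbitrary choice for illustration, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)

B = np.array([[-1.0, 2.0],
              [1.0, -1.0]])

# Hypothetical covariance for Y, chosen only for this demonstration.
S = np.array([[2.0, 0.5],
              [0.5, 1.0]])

# Draw many zero-mean samples of Y with covariance S.
Y = rng.multivariate_normal(np.zeros(2), S, size=200_000)

emp = np.cov((Y @ B.T).T)  # empirical covariance of BY (np.cov wants variables in rows)
theory = B @ S @ B.T       # the claimed identity

print(np.round(emp, 2))
print(theory)
```

With this many samples, the empirical covariance of $BY$ matches $BSB^T$ to within sampling noise.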
But we are given that $\operatorname{Cov}(BY) = \operatorname{Cov}(B(X-b)) = I_2$, so substituting: $B\operatorname{Cov}(Y)B^T = I \implies \operatorname{Cov}(Y) = B^{-1}(B^T)^{-1} \implies E[YY^T] = B^{-1}(B^T)^{-1}$.
Now expand $E[YY^T]$ by substituting back $Y = X-b$:
$$E[YY^T] = E[(X-b)(X-b)^T] = E[XX^T] - E[X]b^T - bE[X]^T + bb^T$$
$$\therefore E[YY^T] = E[XX^T] - \mu_x b^T - b\mu_x^T + bb^T = B^{-1}(B^T)^{-1}$$
Everything here is known except $E[XX^T]$: solve for it and substitute into the very first equation to obtain $\operatorname{Cov}(X)$. (Even quicker: covariance is translation invariant, so $\operatorname{Cov}(X) = \operatorname{Cov}(X-b) = \operatorname{Cov}(Y) = B^{-1}(B^T)^{-1}$ directly.)
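As a numeric sanity check of the whole derivation (not part of the original problem, just numpy carrying out the matrix algebra above):

```python
import numpy as np

# Given quantities from the problem.
b = np.array([1.0, 2.0])
B = np.array([[-1.0, 2.0],
              [1.0, -1.0]])

Binv = np.linalg.inv(B)

# Mean vector: E[X] = b.
mu = b

# Cov(Y) = E[YY^T] = B^{-1} (B^T)^{-1}.
cov_Y = Binv @ Binv.T

# Following the answer's route: E[XX^T] = E[YY^T] + mu b^T + b mu^T - b b^T.
E_XXt = cov_Y + np.outer(mu, b) + np.outer(b, mu) - np.outer(b, b)

# Cov(X) = E[XX^T] - mu mu^T.
cov_X = E_XXt - np.outer(mu, mu)

print(mu)     # [1. 2.]
print(cov_X)  # Cov(X) = [[5., 3.], [3., 2.]]

# Consistency check: B Cov(X) B^T should recover I_2.
print(B @ cov_X @ B.T)
```

The final consistency check confirms that $\operatorname{Cov}(B(X-b)) = B\Sigma B^T = I_2$ holds for the computed $\Sigma$.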