Mean Square Estimate problem

I have to find the minimum mean-square estimate $\hat{\textbf{s}}_{MS}$ given

$\textbf{r} = h\textbf{s}+\textbf{n}$

where $h$ is a Bernoulli random variable with $Pr(h=1)=Pr(h=0) = 1/2$ and $\textbf{s}$ and $\textbf{n}$ are independent zero-mean Gaussian random vectors with covariances $P_s$ and $P_n$.

My Attempt: I know that if $\textbf{r}$ and $\textbf{s}$ are jointly Gaussian, then

\begin{equation} \hat{\textbf{s}}_{MS} = \textbf{m}_s+P_{sr}P_{r}^{-1}[\textbf{r}-\textbf{m}_r] \ \ \ \ \ \ \ \ \ (*) \end{equation}

where $\textbf{m}_s$ and $\textbf{m}_r$ are the mean vectors of $\textbf{s}$ and $\textbf{r}$, $P_{sr}$ is the cross-covariance of $\textbf{s}$ and $\textbf{r}$, and $P_r$ is the covariance of $\textbf{r}$.

Since $\textbf{s}$ and $\textbf{n}$ are zero-mean and $h$ is independent of both, I computed the cross-covariance as

\begin{align} P_{sr} &= \operatorname{Cov}(\textbf{s},\,h\textbf{s}+\textbf{n})\\ &= E[\textbf{s}(h\textbf{s}+\textbf{n})^T]\\ &= E[h]\,E[\textbf{s}\textbf{s}^T]+E[\textbf{s}\textbf{n}^T]\\ &= \frac{1}{2}P_s \end{align}

and, using $h^2=h$ (so $E[h^2]=E[h]=\frac{1}{2}$) and the independence of $h\textbf{s}$ and $\textbf{n}$, that \begin{align*} P_r &= \operatorname{Cov}(h\textbf{s}+\textbf{n})\\ &=\operatorname{Cov}(h\textbf{s})+\operatorname{Cov}(\textbf{n})\\ &=E[h^2]\,E[\textbf{s}\textbf{s}^T]+P_n\\ &= \frac{1}{2}P_s+P_n \end{align*}
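These two second moments are easy to sanity-check numerically. Below is a quick Monte Carlo sketch in the scalar case, with the example values $P_s = 2$ and $P_n = 1$ chosen only for the check (they are not part of the problem statement):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
Ps, Pn = 2.0, 1.0  # assumed example variances, just for this check

h = rng.integers(0, 2, size=N)             # Bernoulli(1/2)
s = rng.normal(0.0, np.sqrt(Ps), size=N)   # zero-mean Gaussian, variance Ps
n = rng.normal(0.0, np.sqrt(Pn), size=N)   # zero-mean Gaussian, variance Pn
r = h * s + n

P_sr = np.mean(s * r)  # sample cross-covariance; should be near Ps/2
P_r = np.var(r)        # sample variance of r; should be near Ps/2 + Pn
print(P_sr, P_r)
```

With these values the estimates come out close to $P_{sr}=1$ and $P_r=2$, matching $\frac{1}{2}P_s$ and $\frac{1}{2}P_s+P_n$.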

So plugging back into $(*)$, and noting that $\textbf{m}_s=\textbf{m}_r=\textbf{0}$ since $\textbf{s}$ and $\textbf{n}$ are zero-mean, I get

\begin{align*} \hat{\textbf{s}}_{MS} &= \frac{1}{2}P_s\left(\frac{1}{2}P_s+P_n\right)^{-1}\textbf{r}\\ &= P_s\left(P_s+2P_n\right)^{-1}\textbf{r} \end{align*}
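The simplification in the last step holds for matrices too, because the factor of $\frac{1}{2}$ is a scalar and commutes with everything. A small numerical sketch, using assumed example covariance matrices (not given in the problem), confirms the two forms of the gain agree:

```python
import numpy as np

# Assumed example covariance matrices, just to check the matrix algebra.
Ps = np.array([[2.0, 0.5],
               [0.5, 1.0]])
Pn = np.array([[1.0, 0.0],
               [0.0, 1.5]])

# Gain in the step form: P_sr P_r^{-1} = (1/2) Ps ((1/2) Ps + Pn)^{-1}
gain_step = 0.5 * Ps @ np.linalg.inv(0.5 * Ps + Pn)

# Gain in the simplified form: Ps (Ps + 2 Pn)^{-1}
gain_simpl = Ps @ np.linalg.inv(Ps + 2.0 * Pn)

print(np.allclose(gain_step, gain_simpl))
```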

The main thing I am not sure about is whether my calculations of $P_{sr}$ and $P_r$ are correct.

It would be rather straightforward if $h$ were deterministic, because then I could just use the Gaussian linear model to solve the problem. I tried adapting that method here, but I am not sure whether I have made a leap somewhere (in particular, whether $\textbf{r}$ and $\textbf{s}$ are still jointly Gaussian when $h$ is random).