Estimation in presence of signal dependent noise


Given a model as below: $$y_1 = x + \eta_1$$ $$y_2 = x + \eta_2$$

where $\eta_1 \sim N(0,\sigma_1^2)$ and $\eta_2 \sim N(0,\sigma_2^2)$, $N$ denotes a Gaussian distribution, and $\sigma_1^2$ and $\sigma_2^2$ are known. It is straightforward to find the estimate for $x$ using maximum likelihood: $$L(x) = p(y|x) = p(y_1,y_2|x)$$ $$L(x) = p(y_1|x)p(y_2|x)$$ $$L(x) = N(y_1;x,\sigma_1^2)N(y_2;x,\sigma_2^2)$$ This can then be solved by taking the derivative of $\ln L(x)$, setting it to zero, and solving for $x$ to get: $$\hat{x}_{ML} = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}y_1 + \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}y_2$$
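As a quick numerical sanity check of the closed form (using made-up observation and variance values), the inverse-variance weighted average should maximize the Gaussian log-likelihood, since the log-likelihood is a concave quadratic in $x$:

```python
import math

def log_likelihood(x, y1, y2, s1sq, s2sq):
    """Log-likelihood of x given two independent Gaussian observations."""
    return (-0.5 * math.log(2 * math.pi * s1sq) - (y1 - x) ** 2 / (2 * s1sq)
            - 0.5 * math.log(2 * math.pi * s2sq) - (y2 - x) ** 2 / (2 * s2sq))

def x_ml(y1, y2, s1sq, s2sq):
    """Closed-form ML estimate: inverse-variance weighted average."""
    return (s2sq * y1 + s1sq * y2) / (s1sq + s2sq)

# Hypothetical observations and known variances.
y1, y2, s1sq, s2sq = 1.2, 0.8, 0.5, 2.0
xhat = x_ml(y1, y2, s1sq, s2sq)

# The closed-form estimate should beat any nearby perturbation.
for dx in (-0.1, -0.01, 0.01, 0.1):
    assert log_likelihood(xhat, y1, y2, s1sq, s2sq) >= \
           log_likelihood(xhat + dx, y1, y2, s1sq, s2sq)
```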

However, for the case where the model is as below: $$y_1 = x + \eta_{h1}$$ $$y_2 = x + \eta_{h2}$$

where $\eta_{h1} \sim N(0,a_1x+b_1)$, $\eta_{h2} \sim N(0,a_2x+b_2)$, and the parameters $a_1,b_1,a_2,b_2$ are known. In this case, the variances of both error terms depend on $x$. Can I still derive an estimate for $x$ through maximum likelihood? Are there other methods to derive an optimal estimate of $x$?
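The likelihood can still be written down and maximized numerically, even though the score equation is no longer linear in $x$ (the variances $a_i x + b_i$ now depend on $x$, so no simple closed form drops out). A minimal sketch, with hypothetical values for $a_i, b_i$ and the observations, using a coarse grid search restricted to the region where both variances are positive:

```python
import math

def log_likelihood(x, y1, y2, a1, b1, a2, b2):
    """Log-likelihood when each noise variance depends linearly on x."""
    v1, v2 = a1 * x + b1, a2 * x + b2
    if v1 <= 0 or v2 <= 0:  # variances must stay positive
        return -math.inf
    return (-0.5 * math.log(2 * math.pi * v1) - (y1 - x) ** 2 / (2 * v1)
            - 0.5 * math.log(2 * math.pi * v2) - (y2 - x) ** 2 / (2 * v2))

# Hypothetical known parameters and observations.
a1, b1, a2, b2 = 0.5, 1.0, 0.3, 2.0
y1, y2 = 3.1, 2.7

# Grid search over a plausible range; a root-finder on the score
# equation or a general-purpose optimizer would also work.
grid = [i / 1000 for i in range(-2000, 8001)]
xhat = max(grid, key=lambda x: log_likelihood(x, y1, y2, a1, b1, a2, b2))
```

Note that because the variance terms also carry information about $x$, the maximizer is generally not the inverse-variance weighted average of $y_1$ and $y_2$ evaluated at $\hat{x}$.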