Let $X$ be a random variable having a normal distribution with mean $\mu$ and standard deviation $\sigma$. The observations $(x_i; i=1,2,\cdots,n)$ have measurement errors that are normally distributed with zero mean and known standard deviation $\delta$. Use the method of maximum likelihood to estimate $\mu$ and $\sigma^2$ from erroneous observations.
How can the method of maximum likelihood be used to find $\mu$ and $\sigma$ from erroneous observations?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 2 solutions below.
You can't estimate $\mu$ or $\sigma^2$ by maximum likelihood unless you assume they are parameters that determine some distribution. For example, if the quantities whose expected value is $\mu$ and whose variance is $\sigma^2$ are assumed to be normally distributed, and in addition assumed to be independent, then it can be done.
In effect you have $x_i\sim\operatorname N(\mu, \sigma^2 + \delta^2)$ and $\delta^2$ is known.
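To see why, write each observation as $x_i = X_i + \varepsilon_i$ with independent $X_i \sim \operatorname N(\mu,\sigma^2)$ and measurement error $\varepsilon_i \sim \operatorname N(0,\delta^2)$. A sum of independent normals is normal, with means and variances adding:
$$ x_i = X_i + \varepsilon_i \sim \operatorname N\!\big(\mu + 0,\; \sigma^2 + \delta^2\big). $$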
It is tempting to say the m.l.e. of $\sigma^2+\delta^2$ is the familiar $s^2 = \frac1n\sum_{i=1}^n (x_i-\bar x)^2$ and then, by equivariance of m.l.e.s, that the m.l.e. of $\sigma^2$ is $s^2-\delta^2.$ But what about cases where $s^2-\delta^2<0\text{ ?}$
Here we need to say that the parameter space for $\sigma^2+\delta^2$ is the interval $[\delta^2,+\infty),$ not $[0,+\infty).$
As a function of $\sigma^2+\delta^2$, the likelihood function decreases on the interval $\Big(\max\{s^2,\delta^2\},+\infty\Big).$ It increases on the interval $\Big[\delta^2, \max\{s^2,\delta^2\}\Big)$ if that is not empty.
Therefore the m.l.e. of $\sigma^2+\delta^2$ is $\max\{s^2,\delta^2\}.$
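To verify the monotonicity claim, profile out $\mu$ (its m.l.e. is $\bar x$ for any value of the variance) and write the log-likelihood as a function of $v = \sigma^2+\delta^2$:
$$ \ell(v) = -\frac n2\log(2\pi v) - \frac{n s^2}{2v}, \qquad s^2 = \frac1n\sum_{i=1}^n (x_i-\bar x)^2, $$
with derivative
$$ \ell'(v) = -\frac{n}{2v} + \frac{n s^2}{2v^2} = \frac{n}{2v^2}\,(s^2 - v), $$
which is positive for $v < s^2$ and negative for $v > s^2.$ Restricting $v$ to $[\delta^2,+\infty)$ therefore puts the constrained maximizer at $\max\{s^2,\delta^2\}.$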
By equivariance of m.l.e.s, the m.l.e. of $\sigma^2$ is therefore $$ \widehat\sigma^2 = \max\{s^2,\delta^2\} - \delta ^2 = \max\{s^2-\delta^2, 0\}. $$
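As a quick numerical sketch (plain Python with only the standard library; the function and variable names here are mine, not from the question), the closed-form estimates $\hat\mu = \bar y$ and $\widehat{\sigma^2} = \max\{s^2-\delta^2, 0\}$ can be checked on simulated data:

```python
import random

def mle_normal_with_noise(y, delta2):
    """MLE of (mu, sigma^2) when y_i ~ N(mu, sigma^2 + delta^2) with delta^2 known."""
    n = len(y)
    mu_hat = sum(y) / n                            # MLE of mu is the sample mean
    s2 = sum((yi - mu_hat) ** 2 for yi in y) / n   # MLE of sigma^2 + delta^2 (1/n version)
    sigma2_hat = max(s2 - delta2, 0.0)             # truncate at the parameter-space boundary
    return mu_hat, sigma2_hat

# Simulate: X ~ N(5, 2^2), observed with measurement noise eps ~ N(0, 1.5^2)
random.seed(0)
mu_true, sigma, delta = 5.0, 2.0, 1.5
y = [random.gauss(mu_true, sigma) + random.gauss(0.0, delta) for _ in range(100_000)]
mu_hat, sigma2_hat = mle_normal_with_noise(y, delta ** 2)
print(mu_hat, sigma2_hat)  # should be close to 5 and 4
```

Note that with a nearly constant sample (so $s^2 < \delta^2$), the estimator correctly returns $\widehat{\sigma^2} = 0$ rather than a negative variance.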
It seems what you're saying is that you have $X \sim N(\mu, \sigma^2)$, that you want to estimate $\mu$ and $\sigma$, and that you don't have samples of $X$ but rather of $Y = X + \varepsilon$ where $\varepsilon \sim N(0, \delta^2)$. Maybe you have a process generating $X$ but then there's measurement noise tacked on to that.
If that's the case, you know $Y \sim N(\mu, \sigma^2 + \delta^2)$, so apply maximum likelihood to your measurements $y_1, \dots, y_n$ to estimate $\mu$ and $\sigma^2 + \delta^2$.
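As a sanity check on the constrained maximization (a sketch in plain Python; the ternary-search helper is mine and assumes the profiled likelihood is unimodal, which the derivative argument above guarantees), one can maximize the log-likelihood numerically over $v = \sigma^2 + \delta^2 \in [\delta^2, \infty)$ and compare with the closed form $\max\{s^2, \delta^2\}$:

```python
import math

def log_likelihood_v(v, y, ybar):
    """Profiled log-likelihood in v = sigma^2 + delta^2 (mu profiled out at the sample mean)."""
    n = len(y)
    s2 = sum((yi - ybar) ** 2 for yi in y) / n
    return -0.5 * n * (math.log(2 * math.pi * v) + s2 / v)

def mle_v(y, delta2, v_hi=1e6, iters=200):
    """Ternary search for the maximizer of the unimodal profiled likelihood on [delta2, v_hi]."""
    ybar = sum(y) / len(y)
    lo, hi = delta2, v_hi
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if log_likelihood_v(m1, y, ybar) < log_likelihood_v(m2, y, ybar):
            lo = m1
        else:
            hi = m2
    return (lo + hi) / 2

# Data whose sample variance is *smaller* than delta^2: the search stops at the boundary
y = [4.9, 5.0, 5.1, 5.0, 5.0]
delta2 = 1.0
v_hat = mle_v(y, delta2)
s2 = sum((yi - sum(y) / len(y)) ** 2 for yi in y) / len(y)
print(v_hat, max(s2, delta2))  # numerical maximizer should agree with the closed form
```

Because $s^2 = 0.004 < \delta^2 = 1$ here, the numerical search pushes the estimate to the boundary $\delta^2$, illustrating why the unconstrained answer $s^2 - \delta^2$ must be truncated at zero.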