Help with LSEs and MLEs (Estimators) in regression


I have $y_i = \theta x_i^2 + \epsilon_i$, where $x_1, \dots, x_n$ are constants.

I think the least squares estimate for $\theta$ is $\sum_{i=1}^n y_i/x_i^2$. Can anyone verify this, and perhaps also help me find the maximum likelihood estimator? I am not sure how to do this.


Let's look at this in general. Consider a single-parameter model for the expected value of the observations, so the fitted value at $x_i$ is $\theta x_i^2$. The LSE minimizes the sum of squared residuals between this fit and the observed values (call them $z_i$): define $r_i = z_i - \theta x_i^2$ and $S = \sum r_i^2$. Setting the derivative to zero, \begin{align} 0 & = \frac{dS}{d\theta} = -2 \sum (z_i - \theta x_i^2) x_i^2 \\ \sum z_i x_i^2 & = \hat\theta \sum x_i^4 \\ \hat \theta_\mathrm{LSE} & = \frac{\sum z_i x_i^2}{\sum x_i^4}. \end{align} Note that your proposed estimator $\sum z_i/x_i^2$ does not minimize $S$; it weights all residuals equally after dividing by $x_i^2$, whereas the LSE weights each observation by $x_i^2$.
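As a quick numerical sanity check (a sketch with simulated data; the variable names and the choice of NumPy/SciPy are mine, not from the question), the closed-form LSE above should agree with a direct numerical minimization of $S(\theta)$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
theta_true = 2.5                                   # assumed true parameter for the simulation
x = rng.uniform(0.5, 2.0, size=200)                # the constants x_1, ..., x_n
z = theta_true * x**2 + rng.normal(0, 0.3, 200)    # observations z_i = theta x_i^2 + noise

# Closed-form LSE from the derivation: sum(z_i x_i^2) / sum(x_i^4)
theta_lse = np.sum(z * x**2) / np.sum(x**4)

# Numerical minimization of S(theta) = sum (z_i - theta x_i^2)^2
theta_num = minimize_scalar(lambda th: np.sum((z - th * x**2) ** 2)).x

assert np.isclose(theta_lse, theta_num, atol=1e-6)
```

The two estimates coincide up to optimizer tolerance, confirming the algebra.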

I will assume that the observations are independent; then, if the errors are normally distributed with variance $\sigma^2$, the likelihood is \begin{align} \mathcal L(\theta) & = \prod \frac{1}{\sqrt{2\pi \sigma^2}} \exp \left[-\frac{(z_i - \theta x_i^2)^2}{2 \sigma^2} \right] \\ \ln \mathcal L & = -\sum \left[\frac{1}{2} \ln 2 \pi \sigma^2 + \frac{(z_i - \theta x_i^2)^2}{2 \sigma^2} \right] \\ \frac{d \ln \mathcal L}{d \theta} & = \sum \frac{x_i^2 (z_i - \theta x_i^2)}{\sigma^2} = 0. \end{align} This is the same estimating equation as before, so $\hat \theta_\mathrm{MLE} = \hat \theta_\mathrm{LSE}$. This is not a coincidence: under independent Gaussian errors, maximizing the likelihood is equivalent to minimizing the sum of squared residuals. For more details see https://en.wikipedia.org/wiki/Least_squares
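The equivalence can also be checked numerically (again a sketch with simulated data; $\sigma$ is assumed known here, and the simulation setup is mine): maximizing the Gaussian log-likelihood over $\theta$ recovers the closed-form LSE.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
theta_true, sigma = 1.7, 0.4                       # assumed simulation parameters
x = rng.uniform(0.5, 2.0, size=500)
z = theta_true * x**2 + rng.normal(0, sigma, 500)

def neg_log_lik(theta):
    # Negative Gaussian log-likelihood from the derivation above
    r = z - theta * x**2
    return 0.5 * len(z) * np.log(2 * np.pi * sigma**2) + np.sum(r**2) / (2 * sigma**2)

theta_mle = minimize_scalar(neg_log_lik).x
theta_lse = np.sum(z * x**2) / np.sum(x**4)

assert np.isclose(theta_mle, theta_lse, atol=1e-6)
```

Note that the MLE of $\theta$ does not depend on $\sigma^2$: the variance only rescales the score equation, so the same $\hat\theta$ solves it for any $\sigma > 0$.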