Non-linear regression with one parameter


I need to solve the following exercise: find the coefficient $\beta$ of the model $y_i = \beta x_{i}^2 + \epsilon_{i}$, knowing only that $E[x_{i}^2\epsilon_{i}] = 0$, and check whether the estimator is consistent. I know it's a basic exercise, but I don't know how to proceed because of the $x^2$. I wrote the equation $\hat{\beta}= (X'^{2}X^{2})^{-1} (X^{2}Y)$ and then got stuck. Can someone explain how to deal with this case (and with non-linear models in general), including the consistency part? Thanks.

1 Answer

1) You said nothing about the distribution of $\epsilon$, so I will assume the standard assumptions hold: the error terms are i.i.d. with $\epsilon \sim \mathcal{N}(0,\sigma^2)$. This is crucial, as different assumptions may complicate the maximization of the likelihood function w.r.t. $\beta$. Under i.i.d. Gaussian noise, we can apply the classic OLS algorithm (the result coincides with the maximum-likelihood solution).

2) This regression model, $y = \beta x^2+\epsilon$, is linear. The linearity of a regression model is defined w.r.t. the coefficients, i.e., by whether $\frac{\partial y}{\partial \beta}$ depends on $\beta$ or not. Hence, for any function $g$ of $x$ that does not involve $\beta$, the model $y=\beta g(x) + \epsilon$ is linear in $\beta$.
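To see this concretely, here is a minimal sketch (with simulated data and an illustrative $\beta = 2$, both my own assumptions): transforming the regressor to $z = x^2$ turns the problem into an ordinary linear least-squares fit.

```python
import numpy as np

# Simulated data for the model y = beta*x^2 + eps (beta_true is illustrative).
rng = np.random.default_rng(0)
n = 1_000
beta_true = 2.0
x = rng.normal(size=n)
eps = rng.normal(scale=0.5, size=n)
y = beta_true * x**2 + eps

# Because the model is linear in beta, OLS applies to the transformed
# regressor z = x^2; the "design matrix" is a single column of x_i^2.
Z = (x**2).reshape(-1, 1)
beta_hat, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(beta_hat[0])  # close to beta_true
```

The same trick works for any fixed transformation $g(x)$: replace the column $x_i^2$ with $g(x_i)$ and the problem stays linear in the coefficient.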

3) As such, you can simply apply the stated result $\hat{\beta} = (X'X)^{-1}X'Y$ or derive it directly, \begin{align} \min_{\beta} S(\beta) =\min_{\beta} \sum_{i=1}^{n}\epsilon_i^2 = \min_{\beta} \sum_{i=1}^n(y_i - \beta x_i^2)^2 , \end{align} so taking the derivative w.r.t. $\beta$ and equating it to $0$, you get $$ \hat{\beta} = \frac{\sum y_i x_i^2}{\sum x_{i}^4}. $$ Note that if you want to use the matrix notation, the design vector must contain the transformed regressors, $X = (x_1^2, \dots, x_n^2)'$ with $\mathrm{y} = (y_1, \dots, y_n)'$; it yields the same result.

As for consistency, substitute the model into the estimator: $$ \hat{\beta} = \beta + \frac{\frac{1}{n}\sum x_i^2 \epsilon_i}{\frac{1}{n}\sum x_i^4}. $$ By the law of large numbers, the numerator converges in probability to $E[x_i^2\epsilon_i] = 0$ (which is exactly the given condition) and the denominator to $E[x_i^4]$, which is positive; assuming it is finite, $\hat{\beta} \xrightarrow{p} \beta$, so the estimator is consistent.
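The consistency argument can be checked numerically. The sketch below (simulated data; the value $\beta = 1.5$ and the Gaussian noise are my own assumptions, chosen so that $E[x_i^2\epsilon_i]=0$ holds) computes the closed-form estimator $\hat{\beta} = \sum y_i x_i^2 / \sum x_i^4$ for growing sample sizes:

```python
import numpy as np

rng = np.random.default_rng(42)
beta_true = 1.5  # illustrative value

def beta_hat(n):
    """Closed-form OLS estimator sum(y*x^2)/sum(x^4) on a fresh sample."""
    x = rng.normal(size=n)
    eps = rng.normal(size=n)  # independent of x, so E[x^2 * eps] = 0
    y = beta_true * x**2 + eps
    return np.sum(y * x**2) / np.sum(x**4)

# The estimate should drift toward beta_true as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, beta_hat(n))
```

The sampling error shrinks roughly like $1/\sqrt{n}$, which is what the LLN-based argument predicts.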