I have encountered the following question and cannot find a correct solution. Consider a regression with cross-sectional data:
$$y_i=c+e^{m(x_i)}+u_i,$$
where, for $i=1,...,n$, the scalar $y_i$ is the dependent variable, $c$ is a constant, $m(\cdot)$ is an unknown univariate smooth function of $x_i\in R$, and $u_i$ is a mean-zero error term satisfying $E(u|x)=0$. Clearly, $E(y|x)=c+e^{m(x)}$.
Question: how can one consistently identify $m(x_i)$ for all $i=1,...,n$ from the above regression?
I have tried estimating $m(x_i)$ through a combination of the local linear estimator and the nonlinear least squares estimator. Specifically, I take a first-order Taylor expansion of $m(x_i)$ around a particular point $x$, so that $m(x_i) \approx m(x)+(x_i-x)m'(x)$, where $m'(x)\equiv \frac{\partial m(x)}{\partial x}$. I then define the estimators of $[m(x),m'(x)]$ as $[\hat{m}(x),\hat{m}'(x)]\equiv [\hat{\alpha}_0,\hat{\alpha}_1]$ and obtain them, together with the estimator $\hat{c}$ of the constant term $c$, by minimizing the following objective function:
$$\sum_{i=1}^{n} \left[y_i-c-e^{\alpha_0+(x_i-x)\alpha_1} \right]^2k\left(\frac{x_i-x}{h}\right),$$
where $k\left(\frac{x_i-x}{h}\right)$ is a univariate kernel function, such as the standard normal p.d.f. Hence the estimators $(\hat{c},\hat{\alpha}_0,\hat{\alpha}_1)$ are obtained through nonlinear least squares, with the unknown function $m(\cdot)$ estimated by a local linear approximation.
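For concreteness, here is a minimal simulation sketch of this local objective. The true function $m(x)=\sin(x)$, the constant $c=2$, the bandwidth, and the use of `scipy.optimize.least_squares` are all my own illustrative choices, not part of the original set-up:

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative data-generating process (my assumption): m(x) = sin(x), c = 2
rng = np.random.default_rng(0)
n, h, c_true = 500, 0.5, 2.0
x = rng.uniform(-2.0, 2.0, n)
y = c_true + np.exp(np.sin(x)) + rng.normal(0.0, 0.1, n)

def weighted_resid(theta, x_pt):
    """Kernel-weighted residuals of the local objective at the point x_pt."""
    c, a0, a1 = theta
    w = np.exp(-0.5 * ((x - x_pt) / h) ** 2)  # standard normal kernel
    return np.sqrt(w) * (y - c - np.exp(a0 + (x - x_pt) * a1))

x_pt = 0.5
fit = least_squares(weighted_resid, x0=[0.0, 0.0, 0.0], args=(x_pt,))
c_hat, m_hat, mprime_hat = fit.x  # estimates of c, m(x_pt), m'(x_pt)
```

In runs of this kind, $\hat{c}$ and $\hat{\alpha}_0$ tend to trade off against each other, which is the identification problem the question describes.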
However, in simulations I find that $m(x_i)$ cannot be identified well unless $c=0$, i.e., unless there is no constant term.
Can anyone share their thoughts on how to estimate the unknown function $m(\cdot)$ nonparametrically in this set-up?
My question above can actually be solved using the local nonlinear least squares (LNLS) estimator of Gozalo and Linton (2000), Journal of Econometrics. My solution above is not feasible because the unknown function cannot be identified in the presence of the constant term $c$: intuitively, within a local window both $c$ and the level $e^{\alpha_0}$ act as local constants, so a single local regression cannot separate them.
Alternatively, instead of approximating the unknown smooth function $m(x)$ in $e^{m(x)}$ by a local linear expansion, we can use parametric regression information as a pilot, say, positing that $m(x)$ is close to $b_0x+b_1x^2$. If the pilot information is correct, the LNLS estimator behaves like a parametric estimator; if it is incorrect, the LNLS estimator still behaves as a consistent nonparametric estimator. In other words, the LNLS estimator is unbiased when the parametric information is correct and consistent in either case.
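A minimal sketch of this pilot-based idea, under my own illustrative assumptions (the pilot family $b_0x+b_1x^2$ is taken to be correct here, with a Gaussian kernel and `scipy.optimize.least_squares`; this is not the full Gozalo–Linton construction, only the local objective with a parametric pilot):

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative DGP (my assumption): the pilot family happens to be correct
rng = np.random.default_rng(1)
n, h = 500, 1.5
c_true, b0_true, b1_true = 2.0, 0.8, 0.3
x = rng.uniform(-2.0, 2.0, n)
y = c_true + np.exp(b0_true * x + b1_true * x**2) + rng.normal(0.0, 0.1, n)

def weighted_resid(theta, x_pt):
    """Kernel-weighted residuals using the pilot exp(b0*x + b1*x^2)."""
    c, b0, b1 = theta
    w = np.exp(-0.5 * ((x - x_pt) / h) ** 2)  # standard normal kernel
    return np.sqrt(w) * (y - c - np.exp(b0 * x + b1 * x**2))

def m_hat(x_pt):
    """Local pilot-based estimate of m at the point x_pt."""
    fit = least_squares(weighted_resid, x0=[0.0, 0.0, 0.0], args=(x_pt,))
    c_loc, b0_loc, b1_loc = fit.x
    return b0_loc * x_pt + b1_loc * x_pt**2
```

Unlike the local-constant level $e^{\alpha_0}$ in the local linear version, the pilot $e^{b_0x_i+b_1x_i^2}$ varies with $x_i$ inside the kernel window, which is what restores the separation between $c$ and $e^{m(x)}$ in this example.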