I have the following objective function: \begin{equation} W(x, y) = p_1\big[p_2 e^{-p_3 x - y/p_4} + (1-p_2) e^{-c_1 x}\big(p_5 e^{-y/c_2} + (1-p_5) e^{y/c_3}\big)\big] \end{equation}
where $c_1, c_2, c_3$ are constants and $p_1,p_2,p_3,p_4,p_5$ are parameters to be estimated.
I am using nonlinear least squares (Levenberg-Marquardt) to estimate the parameters. I was wondering whether the problem above can be solved differently. I cannot linearise it, but is there any alternative that converges faster than a general nonlinear method?
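For reference, here is the kind of fit I mean. This is only an illustrative sketch: the constants $c_1,c_2,c_3$, the "true" parameter values, and the synthetic data are all made up.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical constants and synthetic noiseless data (illustration only).
c1, c2, c3 = 1.0, 2.0, 3.0
rng = np.random.default_rng(0)
x = rng.uniform(0, 2, 200)
y = rng.uniform(0, 2, 200)

def model(p, x, y):
    """Evaluate W(x, y) for parameters p = (p1, p2, p3, p4, p5)."""
    p1, p2, p3, p4, p5 = p
    return p1 * (p2 * np.exp(-p3 * x - y / p4)
                 + (1 - p2) * np.exp(-c1 * x)
                 * (p5 * np.exp(-y / c2) + (1 - p5) * np.exp(y / c3)))

p_true = [2.0, 0.6, 0.8, 2.5, 0.7]     # made-up "true" parameters
w = model(p_true, x, y)

# Levenberg-Marquardt on the raw 5-parameter residual.
fit = least_squares(lambda p: model(p, x, y) - w,
                    x0=[1.0, 0.5, 1.0, 2.0, 0.5], method="lm")
```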
Your function is partially linear in the parameters and is better tamed by reparametrizing as follows:
$$a=p_1p_2,\quad b=p_1(1-p_2)p_5,\quad c=p_1(1-p_2)(1-p_5),\quad d=-p_3,\quad e=-1/p_4.$$
Now we have
$$W(x,y)=a\exp(dx+ey)+b\exp(c'_1x+c'_2y)+c\exp(c'_1x+c'_3y),$$
where $c'_1=-c_1$, $c'_2=-1/c_2$ and $c'_3=1/c_3$ are known constants.
If the parameters $d$ and $e$ were known, the model would be a multiple linear regression (with no constant term), for which we have explicit formulas.
Now, for any choice of $d,e$, the linear regression yields a residual sum of squares, say $R(d,e)$, and you have reduced the question to the minimization of a nonlinear function of just two variables. (It is possible to compute the gradient and Hessian of $R$, but this is a little tedious.)
You should probably focus on finding good initial values of $d,e$ and a tight search range.
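A minimal sketch of this reduction (essentially variable projection): the constants and synthetic noiseless data below are made up purely for illustration, and the inner linear solve uses ordinary least squares.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical constants and synthetic noiseless data (illustration only).
c1, c2, c3 = 1.0, 2.0, 3.0
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 2, 200), rng.uniform(0, 2, 200)
col_b = np.exp(-c1 * x - y / c2)          # basis column multiplying b
col_c = np.exp(-c1 * x + y / c3)          # basis column multiplying c
# Data generated with (a, b, c, d, e) = (1.0, 0.5, 0.25, -0.8, -0.4).
w = 1.0 * np.exp(-0.8 * x - 0.4 * y) + 0.5 * col_b + 0.25 * col_c

def R(de):
    """Residual sum of squares after solving the linear subproblem
    in (a, b, c) with the nonlinear pair (d, e) held fixed."""
    d, e = de
    A = np.column_stack([np.exp(d * x + e * y), col_b, col_c])
    coef, *_ = np.linalg.lstsq(A, w, rcond=None)
    return np.sum((A @ coef - w) ** 2)

# Minimize over the two nonlinear parameters only.
res = minimize(R, x0=[-1.0, -1.0], method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-14})
d_hat, e_hat = res.x
# Recover (a, b, c) by one final linear solve at the optimum.
A = np.column_stack([np.exp(d_hat * x + e_hat * y), col_b, col_c])
a_hat, b_hat, c_hat = np.linalg.lstsq(A, w, rcond=None)[0]
```

The point of the design is that the iterative search runs over two parameters instead of five; the remaining three are obtained in closed form at each step.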
Also note that, knowing approximate values of $a,b,c$, you can isolate the first term, $(W-b e^{c'_1x+c'_2y}-c e^{c'_1x+c'_3y})/a=e^{dx+ey}$, take logarithms, and form another linear regression problem to estimate $d,e$. From these, re-estimate $a,b,c$, and so on.
This is a mixed least-squares/fixed-point approach. I have no idea if it will converge (!)