Cannot isolate $y$ since it is inside an argument to $e$


For the following equation, I have been able to solve for $A$ and $B$ using non-linear least squares (implemented with SciPy's curve_fit function). It's easy because $y$ has been isolated: I just give the function a few known $(x_k,y_k)$ pairs and the algorithm solves for $A$ and $B$ iteratively. $C_0$ is a known constant.

$$ y = Axe^{\frac{xB}{C_0}} $$
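For concreteness, the fit for this isolated-$y$ model might be sketched as follows (the value of $C_0$, the "true" parameters, and the starting guesses are all invented for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

C0 = 2.0  # the known constant (illustrative value)

def model(x, A, B):
    # y = A * x * exp(x*B/C0), with y isolated on the left-hand side
    return A * x * np.exp(x * B / C0)

# synthetic (x_k, y_k) pairs generated from assumed "true" parameters
A_true, B_true = 1.5, -0.4
x_k = np.linspace(0.5, 5.0, 20)
y_k = model(x_k, A_true, B_true)

popt, _ = curve_fit(model, x_k, y_k, p0=(1.0, -0.1))
A_fit, B_fit = popt
```

On noiseless data like this, curve_fit recovers the generating parameters essentially exactly.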

But what happens when $y$ is inside the argument to $e$? How do you go about solving this equation?

$$ y = Axe^{(x-y)\frac{B}{C_0}} $$

A pointer to some numerical technique would be nice. I am particularly looking for examples using SciPy solvers.

There are 2 answers below.


Let $u = x-y$. Then the equation becomes $x-u = Axe^{uB/C_0}$, thus $u = x-Axe^{uB/C_0}$ and $u = x\left(1-Ae^{uB/C_0}\right)$, so $$x = \frac{u}{1-Ae^{uB/C_0}}.$$ Fit this equation instead, treating $u_k = x_k - y_k$ as the independent variable and $x_k$ as the dependent one, and it should work.
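This role swap drops straight into curve_fit. A sketch, with $C_0$ and the parameter values made up for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

C0 = 2.0  # the known constant (illustrative value)

def x_of_u(u, A, B):
    # role-swapped model: x = u / (1 - A*exp(u*B/C0))
    return u / (1.0 - A * np.exp(u * B / C0))

# synthetic data from assumed "true" parameters
A_true, B_true = 0.4, -0.5
u_k = np.linspace(0.5, 3.0, 20)
x_k = x_of_u(u_k, A_true, B_true)
y_k = x_k - u_k  # in practice, u_k = x_k - y_k is computed from the data

popt, _ = curve_fit(x_of_u, u_k, x_k, p0=(0.3, -0.3))
A_fit, B_fit = popt
```

Note that this minimizes the residuals in $x$ rather than in $y$, so with noisy data the result can differ slightly from a fit that minimizes errors in $y$.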


From a formal point of view, the problem is quite interesting, since $$y = Axe^{k(x-y)}\implies y e^{ky}=Axe^{kx}\implies ky e^{ky}=Akxe^{kx},$$ and using the Lambert function this reduces to $$y=\frac{W\left(A k x e^{k x}\right)}{k},$$ which can be fitted by nonlinear regression, provided "reasonable" estimates of the parameters $A$ and $k$ are available.

These estimates can easily be obtained by writing $$\log(y)=\log(A)+\log(x)+k(x-y)\implies \log\left(\frac y x \right)=\log(A)+k(x-y).$$ So, for each of the $(x_i,y_i)$ data points, define $z_i=\log\left(\frac {y_i} {x_i} \right)$ and $t_i=x_i-y_i$; the linearized regression model is $$z=a+k t,$$ from which $a=\log(A)$ and $k$ are easily obtained.

For illustration purposes, let us play with the following data points (they contain some significant noise). $$\left( \begin{array}{cccc} x_i & y_i & z_i & t_i \\ 5 & 2 & -0.916291 & 3 \\ 6 & 3 & -0.693147 & 3 \\ 7 & 4 & -0.559616 & 3 \\ 8 & 4 & -0.693147 & 4 \\ 9 & 5 & -0.587787 & 4 \\ 10 & 6 & -0.510826 & 4 \\ 11 & 6 & -0.606136 & 5 \\ 12 & 7 & -0.538997 & 5 \\ 13 & 8 & -0.485508 & 5 \\ 14 & 8 & -0.559616 & 6 \\ 15 & 9 & -0.510826 & 6 \end{array} \right)$$

The linear regression leads to $$z=-0.890857+0.0653653\, t,$$ corresponding to $A=e^{-0.890857}=0.410304$ and $k=0.0653653$.
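This linearized fit is easy to reproduce, for example with NumPy's polyfit on the $(t_i, z_i)$ pairs built from the table above:

```python
import numpy as np

# the (x_i, y_i) data points from the table
x = np.array([5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15], dtype=float)
y = np.array([2, 3, 4, 4, 5, 6, 6, 7, 8, 8, 9], dtype=float)

z = np.log(y / x)  # z_i = log(y_i / x_i)
t = x - y          # t_i = x_i - y_i

k, a = np.polyfit(t, z, 1)  # fit z = a + k*t
A = np.exp(a)               # since a = log(A)
```

This reproduces $a=-0.890857$, $k=0.0653653$, and $A=0.410304$.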

From here, we can start the nonlinear regression with the Lambert function. After a few iterations, we obtain $A=0.42601$ and $k=0.0597227$, corresponding to $R^2=0.9976$ (which is quite good).
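A possible SciPy implementation of this second stage uses `scipy.special.lambertw` for $W$ and the linearized estimates as the starting point:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import lambertw

# the (x_i, y_i) data points from the table
x = np.array([5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15], dtype=float)
y = np.array([2, 3, 4, 4, 5, 6, 6, 7, 8, 8, 9], dtype=float)

def model(x, A, k):
    # y = W(A*k*x*exp(k*x)) / k; the principal branch is real here
    # because the argument is positive, so take the real part
    return lambertw(A * k * x * np.exp(k * x)).real / k

# start from the estimates given by the linearized regression
popt, _ = curve_fit(model, x, y, p0=(0.410304, 0.0653653))
A_fit, k_fit = popt
```

This lands close to the values quoted above, $A\approx0.426$ and $k\approx0.0597$.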

The next table compares the original data with the values from the curve fit: $$\left( \begin{array}{ccc} x_i & y_i & y_i^{calc} \\ 5 & 2 & 2.47652 \\ 6 & 3 & 3.04873 \\ 7 & 4 & 3.64390 \\ 8 & 4 & 4.26083 \\ 9 & 5 & 4.89834 \\ 10 & 6 & 5.55527 \\ 11 & 6 & 6.23048 \\ 12 & 7 & 6.92290 \\ 13 & 8 & 7.63148 \\ 14 & 8 & 8.35523 \\ 15 & 9 & 9.09322 \end{array} \right)$$

As you can see, the final values of the parameters are quite different from the initial estimates. This is normal, because the nonlinear regression minimizes the errors in $y_i$ while the linearized one minimizes the errors in $\log\left(\frac {y_i} {x_i} \right)$.

Edit

The problem can be extended in a similar way to $$y=A x^a e^{b x+ c y},$$ for which the analytical solution would be $$y=-\frac{W\left(-A c x^a e^{b x}\right)}{c}.$$ To obtain estimates of the parameters, write $$\log(y)=\log(A)+a \log(x)+ b x + c y,$$ which is again a linear regression. Once these estimates have been obtained, a nonlinear regression can be launched.