I am currently working with an exponential polynomial of the form $$f(x)=a_1 \cdot\exp(\,\lambda_1\cdot x\,) + a_2\cdot \exp(\,\lambda_2 \cdot x\,) + a_3.$$ In my application it is always true that $a_1>0$, $a_2>0$, and $a_3<0$, while $\lambda_1>\lambda_2>0.$ It would seem that these conditions ensure that there is always going to be a unique real $x_*$ that satisfies $f(x_*)=0.$
I was wondering whether anyone might have an idea how to approximate $x_*$ in terms of a closed-form expression. So far, I have been solving this equation with a non-linear equation solver, which turns out to be quite a bottleneck in a broader exercise.
Thank you so much.
In the most general case, there is no closed-form solution but, using the given conditions, I suppose that we can make approximations which would make the solver much happier. $$f(x)=a_1\, e^{\lambda_1 x} + a_2\,e^{\lambda_2 x} + a_3$$ All derivatives are positive, so $f$ is strictly increasing and convex and has a unique real root; the root is positive $\quad\color{red}{\text{if}\,\,(a_1+a_2+a_3 <0)}$, i.e. if $f(0)<0$, which is assumed below.
The double inequality $$(a_1+a_2)\,e^{\lambda_2 x}<a_1\, e^{\lambda_1 x} + a_2\,e^{\lambda_2 x}<(a_1+a_2)\,e^{\lambda_1 x}$$ brackets the root: $$\color{red}{x_1}=\frac 1{\lambda_1}\log \left(-\frac{a_3}{a_1+a_2}\right)< x <\frac 1{\lambda_2}\log \left(-\frac{a_3}{a_1+a_2}\right)=\color{red}{x_2}$$
Since $f(x_2)>0$ and $f$ is convex, by Darboux's theorem, Newton iterations started from $x_2$ will converge monotonically, without any overshoot of the solution.
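A minimal numerical sketch of this bracketing-plus-Newton scheme (in Python, using the coefficient values from the worked example further down):

```python
import math

# Coefficients from the worked example; any a1, a2 > 0, a3 < 0 with
# lambda1 > lambda2 > 0 and a1 + a2 + a3 < 0 behaves the same way.
a1, lam1 = 1.0, 4.56
a2, lam2 = 5.67, 1.23
a3 = -123456789.0

def f(x):
    return a1 * math.exp(lam1 * x) + a2 * math.exp(lam2 * x) + a3

def fprime(x):
    return a1 * lam1 * math.exp(lam1 * x) + a2 * lam2 * math.exp(lam2 * x)

# Bracket from the double inequality
x1 = math.log(-a3 / (a1 + a2)) / lam1
x2 = math.log(-a3 / (a1 + a2)) / lam2

# Newton from x2: f is convex and f(x2) > 0, so the iterates decrease
# monotonically to the root without overshooting it.
x = x2
for _ in range(200):
    step = f(x) / fprime(x)
    x -= step
    if abs(step) <= 1e-15 * abs(x):
        break

print(x1, x2, x)  # the root lies in [x1, x2]
```

Note that starting from $x_2$ the early steps shrink $x$ by roughly $1/\lambda_1$ per iteration (the first exponential dominates there), which is why the tighter starting points below pay off.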
Now, we can do a bit better considering $$g(x)=\log\Big[a_1\, e^{\lambda_1 x}\Big] - \log\Big[-a_3-a_2\,e^{\lambda_2 x} \Big]=\lambda_1 x+\log(a_1)- \log\Big[-a_3-a_2\,e^{\lambda_2 x} \Big]$$ which is much closer to linear. Drawing the secant line through $\big(x_1,g(x_1)\big)$ and $\big(x_2,g(x_2)\big)$, another estimate to start with is $$x_3=\frac{g(x_1)\, x_2-g(x_2)\, x_1}{g(x_1)-g(x_2)}$$
Just an example: $a_1=1$, $\lambda_1=4.56$, $a_2=5.67$, $\lambda_2=1.23$, $a_3=-123456789$ give $x_1=3.66969$, $x_2=13.6047$, $x_3=4.06910$. Just for fun, plot $f(x)$ and $g(x)$ for $x_1 \leq x \leq x_2$.
Now, Newton iterations for $g(x)$ using $x_3$ as a starting point: $$\left( \begin{array}{cc} n & x_n \\ 0 & 4.0691027223043877784 \\ 1 & 4.0858321874716327938 \\ 2 & 4.0858321871513902535 \end{array} \right)$$ which is the solution to $20$ significant figures.
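The secant start plus Newton-on-$g$ pipeline can be sketched as follows (again with the example coefficients):

```python
import math

a1, lam1 = 1.0, 4.56
a2, lam2 = 5.67, 1.23
a3 = -123456789.0

def g(x):
    # g(x) = lam1*x + log(a1) - log(-a3 - a2*exp(lam2*x))
    return lam1 * x + math.log(a1) - math.log(-a3 - a2 * math.exp(lam2 * x))

def gprime(x):
    t = a2 * math.exp(lam2 * x)
    return lam1 + lam2 * t / (-a3 - t)

# Bracket endpoints and the secant estimate x3
x1 = math.log(-a3 / (a1 + a2)) / lam1
x2 = math.log(-a3 / (a1 + a2)) / lam2
x3 = (g(x1) * x2 - g(x2) * x1) / (g(x1) - g(x2))

# Newton on g, started at x3; two iterations already give full
# double precision, matching the table above.
x = x3
for _ in range(5):
    x -= g(x) / gprime(x)

print(x3, x)
```

Since $g$ is nearly linear on $[x_1,x_2]$, the secant estimate lands very close to the root and Newton converges in a couple of steps, versus dozens of steps for Newton on $f$ from $x_2$.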
Edit
Maybe you could save computer resources by working directly with $g(x)$: a single Newton step from $x_0=0$ gives $$x_1=\frac 1 {\lambda_1-\frac{a_2 \lambda_2}{a_2+a_3} }\log \left(-\frac{a_2+a_3}{a_1}\right)$$ For the worked example, this gives $x_1=\color{red}{4.08583}37$.
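This closed-form estimate is a one-liner (Python, with the example coefficients; note $a_2+a_3<0$, so the logarithm's argument is positive):

```python
import math

a1, lam1 = 1.0, 4.56
a2, lam2 = 5.67, 1.23
a3 = -123456789.0

# One Newton step for g at x0 = 0, written in closed form
x1 = math.log(-(a2 + a3) / a1) / (lam1 - a2 * lam2 / (a2 + a3))
print(x1)  # about 4.0858337, already five correct significant figures
```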
A little bit better: the first iterate of Halley's method is
$$x_1=- \frac {2 g(0) g'(0)} {2 {[g'(0)]}^2 - g(0) g''(0)}$$
$$g(0)=\log \left(-\frac{a_1}{a_2+a_3}\right)\qquad g'(0)=\lambda _1-\frac{a_2 \lambda _2}{a_2+a_3}\qquad g''(0)=-\frac{a_2 a_3 \lambda _2^2}{(a_2+a_3)^2}$$
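Evaluating the Halley step directly (a sketch in Python with the example coefficients):

```python
import math

a1, lam1 = 1.0, 4.56
a2, lam2 = 5.67, 1.23
a3 = -123456789.0

# g, g', g'' evaluated at x = 0
g0   = math.log(-a1 / (a2 + a3))
gp0  = lam1 - a2 * lam2 / (a2 + a3)
gpp0 = -a2 * a3 * lam2**2 / (a2 + a3)**2

# First Halley iterate from x0 = 0
x1 = -2.0 * g0 * gp0 / (2.0 * gp0**2 - g0 * gpp0)
print(x1)
```

Since $g''(0)$ is tiny for these coefficients, the Halley correction only nudges the Newton estimate, consistent with "a little bit better".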